Researching for better instructional methods using AB experiments in MOOCs: results and challenges

Zhongzhou Chen, Christopher Chudzicki, Daniel Palumbo, Giora Alexandron, Youn Jeng Choi, Qian Zhou, David E. Pritchard

Research output: Contribution to journal › Article › peer-review

11 Scopus citations

Abstract

We conducted two AB experiments (treatment vs. control) in a massive open online course. The first experiment evaluates deliberate practice activities (DPAs) for developing problem-solving expertise as measured by traditional physics problems. We find that a more interactive drag-and-drop format of DPA produces faster learning than a multiple-choice format, but that DPAs do not improve performance on traditional physics problems more than normal homework practice does. The second experiment shows that a different video-shooting setting can improve the fluency of the instructor, which in turn improves student engagement, although it has no significant impact on learning outcomes. These two cases demonstrate the potential of MOOC AB experiments as an open-ended research tool but also reveal limitations. We discuss the three most important challenges: the wide distribution of students, the "open-book" nature of assessments, and the large quantity and variety of data, and we suggest possible methods to cope with them.
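The paper does not publish its analysis code, but the core of any AB experiment like those described is a treatment-vs-control comparison of an outcome measure. The sketch below is a minimal, hypothetical illustration (the scores and sample sizes are invented, not the paper's data): it compares mean post-test scores between two arms using Welch's t statistic, with a large-sample normal approximation for the two-sided p-value so that only the standard library is needed.

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic and a normal-approximation two-sided p-value
    for the difference in means of two independent samples."""
    na, nb = len(a), len(b)
    va, vb = variance(a), variance(b)          # sample variances
    se = math.sqrt(va / na + vb / nb)          # standard error of the mean difference
    t = (mean(a) - mean(b)) / se
    # For large samples t is approximately standard normal,
    # so the two-sided p-value is erfc(|t| / sqrt(2)).
    p = math.erfc(abs(t) / math.sqrt(2))
    return t, p

# Hypothetical post-test scores (fraction correct) for the two arms.
control   = [0.55, 0.60, 0.48, 0.72, 0.66, 0.58, 0.61, 0.50]
treatment = [0.63, 0.70, 0.59, 0.75, 0.68, 0.64, 0.71, 0.60]

t, p = welch_t(treatment, control)
print(f"mean diff = {mean(treatment) - mean(control):+.3f}, t = {t:.2f}, p ~ {p:.3f}")
```

With real MOOC data one would also have to handle the challenges the abstract raises: heterogeneous student populations (stratify or covariate-adjust rather than pool) and "open-book" assessments (outcome measures that may not reflect unaided ability).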

Original language: English
Article number: 9
Journal: Research and Practice in Technology Enhanced Learning
Volume: 11
Issue number: 1
DOIs
State: Published - 1 Dec 2016

Keywords

  • Cognitive Load
  • Deliberate Practice
  • Extraneous Cognitive Load
  • Multiple Choice Format
  • Traditional Problem
