Abrar Alturkistani, Ching Lam, Kimberley Foley, Terese Stenfors, Elizabeth R Blum, Michelle Helena Van Velthoven, Edward Meinert.
Abstract
BACKGROUND: Massive open online courses (MOOCs) have the potential to make a broader educational impact because many learners undertake these courses. Despite their reach, there is a lack of knowledge about which methods are used for evaluating these courses.
Keywords: computer-assisted instruction; learning; online learning
Year: 2020 PMID: 32338618 PMCID: PMC7215503 DOI: 10.2196/13851
Source DB: PubMed Journal: J Med Internet Res ISSN: 1438-8871 Impact factor: 5.428
Figure 1. A Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) flowchart of the literature search.
Aims of the massive open online course evaluations in the included studies.
| Evaluation aim focus, subcategories | Number of studies |
| Learner expectations | 3 |
| Learner characteristics and behavior | 16 |
| Learner engagement | 4 |
| Participation or completion rates | 6 |
| Learner satisfaction | 2 |
| Peer interaction | 4 |
| Learning outcomes and experience | 20 |
| Knowledge retention | 1 |
| Pedagogical practices | 2 |
| Comparison with other learning platforms | 4 |
| MOOCa content and structure | 1 |
| Implementation of MOOC | 5 |
| Sustainability of MOOC | 1 |
aMOOC: massive open online course.
Studies using different data sources (N=33; studies used multiple sources, so percentages are of the 65 data source uses reported).
| Data source | Uses, n (%) |
| Surveys | 20 (30.8) |
| Interviews | 8 (12.3) |
| Learning management system | 18 (27.7) |
| Discussions | 5 (7.7) |
| Quizzes | 9 (13.8) |
| Other | 5 (7.7) |
Data collection methods and their uses in massive open online course evaluations.
| Data collection method | Uses |
| Registration form | To collect demographic information |
| Pre-MOOCa survey | To collect data on the following: demographic information, … |
| Pretest | To collect baseline test scores for comparison with posttest scores |
| Learning management system data | To collect data on the following: demographic information, … |
| Discussion posts | Feedback about the course |
| Quiz, homework, or test (not specified as pre- or postquiz or test) | Grades to assess learning |
| Post-MOOC survey | To collect demographic information |
| Posttest | To assess learning |
| End of MOOC quiz | To record learners’ feedback in relation to the course material (whether the course helped them become …) |
| Postcourse interview | Course participation and evaluation |
| Email interview | To understand learners’ behavior and learning in MOOCs |
| Online focus group | Assessment of the course: organization, assessment, … |
aMOOC: massive open online course.
Data collection method uses listed earlier and how they were analyzed in massive open online course evaluations.
| Data collection method uses, parameters or themes reported | Data analysis methods |
| Learning; learning performance; “learning outcome”; overall learner ability in the course; the students’ gains in comprehensibility; subject-matter knowledge; comprehensibility of learner audio recordings in a language MOOCa; knowledge retention | Calculation of the mean difference between pretest and posttest scores; comparison of pretest and posttest scores using a paired t test; descriptive statistics; “regressing quiz and homework score on participation and MOOC experience”; calculation of normalized gain between pretest and posttest scores; Item Response Theory analysis of pretest, posttest, and homework performance; knowledge test gains calculated as normalized learning gains comparing pretest and posttest scores; calculation of gains in comprehensibility; a paired-samples t test; two independent sample t tests |
| Contributions per week; number of tweets; “quality of posted comments and learning designs”; “quality of peer feedback”; “ranking of importance of course features”; “comments received by those posting and sharing a scenario idea in Week 2”; determinants of completion; course completion rate; number of videos watched; video activity (play, stops, and full watch); number of quizzes submitted; discussion forum activity (reading in forums, number of posts and comments); dropout rate; reasons for dropping out of the course; number of comments per participant; completed steps; the “likes” count; frequency of viewing lectures; learner course activity and course grade; satisfaction with MOOC, comfort with learning new things, and joining MOOC because of the “Love for Learning” | Descriptive statistics; logistic regression of homework and exam outcomes; regression; frequency analysis |
| Comparison of a Likert scale rating of “the technology quality and user-friendliness of the Web environment, the quality of instructional content, and the instructional arrangement,” satisfaction with interactions with instructors, satisfaction with support received, and the satisfaction of learning needs between MOOC and onsite learners; “perceived usefulness and ease of use” of MOOC; “perceived learning experience”; learner rating of the “usefulness and relevance of the activities”; overall learner attitude | Comparison of “Likert scale items” using the Mann-Whitney U tests |
| Student expectations (theme); whether the course fulfilled expectations | Descriptive content analysis; descriptive statistics |
| “Autonomous learning across distributed platforms, learning through diversity, learning through openness and interactivity, organizing learning through aggregation, co-creation, and creativity through remixing and repurposing, coping with uncertainty, and identity building” (themes); how learners approach “professional learning” in a MOOC, what behavior learners exhibit, and how “professionals relate their MOOC learning to their professional role”; factors predicting learner and student success; learner self-reported “assertions on learning strategies” | Qualitative descriptive analysis; coding of interview data; ordinary least squares regression using learner demographic data and knowledge data; descriptive statistics |
| Learner course activity and course grade; learner rating of course perseverance | Frequency analysis |
| Learner opinions about course effectiveness; “what students took away from the MOOC” (theme) | Grounded-theory analysis of interview data; descriptive content analysis |
| Learner “interaction in forums”; learner-to-learner interactions; learner collaboration patterns | Social network analysis; content analysis of discussion posts using the Interaction Analysis Model |
| “Learning motivation”; learner self-reported “assertions on motivation”; “a reason for taking or completing the course”; exploring the “primary motivation for taking” the course | Descriptive qualitative analysis; descriptive statistics; thematic analysis of interview data; emergent coding of survey data |
aMOOC: massive open online course.
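Several of the quantitative analyses named in the table above reduce to simple arithmetic on pretest and posttest scores. The sketch below (not code from any reviewed study; the scores are invented for illustration) computes Hake's normalized learning gain and a paired-samples t statistic in Python:

```python
# Minimal sketch of two analyses named in the table: normalized learning
# gain, g = (post - pre) / (max - pre), and the paired-samples t statistic
# for pre/post score differences. All scores below are invented examples.
import math
from statistics import mean

def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """Fraction of the possible improvement a learner actually achieved."""
    # Undefined when pre == max_score (no room for improvement).
    return (post - pre) / (max_score - pre)

def paired_t(pre: list, post: list) -> float:
    """Paired-samples t statistic: mean difference over its standard error."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    d_bar = mean(diffs)
    sd = math.sqrt(sum((d - d_bar) ** 2 for d in diffs) / (n - 1))
    return d_bar / (sd / math.sqrt(n))

pre_scores = [40, 55, 60, 35, 50]
post_scores = [70, 80, 75, 60, 85]
gains = [normalized_gain(a, b) for a, b in zip(pre_scores, post_scores)]
print(round(mean(gains), 2))                        # mean normalized gain: 0.5
print(round(paired_t(pre_scores, post_scores), 2))  # t statistic: 7.84
```

In practice the reviewed studies would use a statistics package for the t test (which also reports a p value and degrees of freedom); the hand computation here only shows what the statistic measures.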