Anne M. Casper, Sarah L. Eddy, Scott Freeman.
Abstract
Our first two experiments on adapting a high-structure course model to an essentially open-enrollment university produced negative or null results. Our third experiment, however, proved more successful: performance improved for all students, and a large achievement gap that impacted underrepresented minority students under traditional lecturing closed. Although the successful design included preclass preparation videos, intensive active learning in class, and weekly practice exams, student self-report data indicated that total study time decreased. Faculty who have the grit to experiment and persevere in making evidence-driven changes to their teaching can reduce the inequalities induced by economic and educational disadvantage.
Year: 2019 PMID: 31318869 PMCID: PMC6667208 DOI: 10.1371/journal.pbio.3000359
Source DB: PubMed Journal: PLoS Biol ISSN: 1544-9173 Impact factor: 8.029
Table 1. Changes in course design.

| Experimental group | Before class | During class: overall style | During class: clickers | During class: questions posed by instructor | After class |
|---|---|---|---|---|---|
| Control | Recommended textbook reading | Lecture | For points; answered in groups | Few, answered by volunteers | Online homework, 15–30 questions, one per week |
| Experiment 1 (one semester, two total class sections; …) | No change | No change | No change | No change | Online practice exam with time limit, 15 questions, one per week |
| Experiment 2 (one semester, two total class sections; …) | Required textbook reading, online reading quiz, short-answer homework due at beginning of class (hand-graded by GAs) | Lecture-free; group worksheets and activities | For points; answered alone first, then peer instruction | Many, random call for answers | Online practice exam without time limit, 15 questions, one per week |
| Experiment 3 (three semesters, four total class sections; …) | Required video, closed-notes short-answer quiz over video at beginning of class (peer-graded) | No change | No change | No change | Online practice exam without time limit, 20–30 questions, one per week |
In each case, “No change” indicates no changes from the previous version of the class.
Abbreviation: GA, graduate student assistant
Fig 1. Students in Exp. 3 had increased exam performance and lower failure rates.
The data points are model-predicted average exam points earned on each exam (A) and percent chance of earning a D or F or withdrawing from the course (B) (S1 Data). Predictions are for students with average ACT scores in the study sample. Error bars represent the standard error around the estimate. Exp., experiment; W, withdrawal.
Fig 2. Achievement gaps shrank in experiment 3.
The bars are model-predicted average exam points earned on each exam (A) and percent chance of earning a D or F or withdrawing from the course (B) (S1 Data). Predictions are for students in these courses, with the average score for the summary variable based on ACT reading, math, English, and science reasoning scores. Error bars represent the standard error around the estimate. URM, underrepresented minority; W, withdrawal.
Fig 3. Students in experiment 3 spent less time studying.
The boxes indicate the first quartile, median, and third quartile of the raw data (not model-predicted) of self-reported time spent studying per week for introductory biology; the whiskers represent 1.5 times the interquartile range; the dots are data points outside this range (S1 Data). Although we do not have data from control semesters, the course format was identical to that of experiment 1 except for replacing homework problems with a timed practice exam (Table 1).
Table 2. Student evaluations of teaching.
| Semester | Percent of students selecting much below or below average | Percent of students selecting average | Percent of students selecting above or much above average |
|---|---|---|---|
| Control semester 1 | 11 | 25 | 64 |
| Control semester 2 | 13 | 27 | 60 |
| Experiment 1 | 9 | 23 | 68 |
| Experiment 2 | 14 | 30 | 55 |
| Experiment 3 semester 1 | 8 | 19 | 73 |
| Experiment 3 semester 2 | 2 | 13 | 85 |
| Experiment 3 semester 3 | 5 | 36 | 58 |
| Semester | Percent of students selecting much below or below average | Percent of students selecting average | Percent of students selecting above or much above average |
|---|---|---|---|
| Control semester 1 | 10 | 26 | 64 |
| Control semester 2 | 10 | 18 | 72 |
| Experiment 1 | 13 | 15 | 73 |
| Experiment 2 | 13 | 19 | 68 |
| Experiment 3 semester 1 | 4 | 13 | 82 |
| Experiment 3 semester 2 | 2 | 13 | 85 |
| Experiment 3 semester 3 | 12 | 8 | 79 |
Course evaluation response rates were 58% in control term 1, 55% in control term 2, 61% in the experiment 1 term, 68% in the experiment 2 term, 65% in experiment 3 term 1, 71% in experiment 3 term 2, and 75% in experiment 3 term 3.