Sungmin Moon, Mallory A Jackson, Jennifer H Doherty, Mary Pat Wenderoth.
Abstract
Evidence-based teaching practices are associated with improved student academic performance. However, these practices encompass a wide range of activities, and determining which type, intensity, or duration of activity effectively improves student exam performance has been elusive. To address this shortcoming, we used a previously validated classroom observation tool, the Practical Observation Rubric to Assess Active Learning (PORTAAL), to measure the presence, intensity, and duration of evidence-based teaching practices in a retrospective study of upper- and lower-division biology courses. We determined the cognitive challenge of exams by categorizing all exam questions obtained from the courses using Bloom's Taxonomy of Cognitive Domains. We used structural equation modeling to correlate the PORTAAL practices with exam performance while controlling for the cognitive challenge of exams, students' GPA at the start of the term, and students' demographic factors. Small group activities, randomly calling on students or groups to answer questions, explaining alternative answers, and total time students spent thinking, working with others, or answering questions had positive correlations with exam performance. On exams at higher Bloom's levels, students explaining the reasoning underlying their answers, students working alone, and receiving positive feedback from the instructor also correlated with increased exam performance. Our study is the first to demonstrate a correlation between the intensity or duration of evidence-based PORTAAL practices and student exam performance while controlling for the Bloom's level of exams, and to identify which practices correlate with performance on exams at low and high Bloom's levels. This level of detail will provide valuable insights for faculty as they prioritize changes to their teaching.
As we found that multiple PORTAAL practices had a positive association with exam performance, it may be encouraging for instructors to realize that there are many ways to benefit students' learning by incorporating these evidence-based teaching practices.
Year: 2021 PMID: 34847190 PMCID: PMC8631643 DOI: 10.1371/journal.pone.0260789
Source DB: PubMed Journal: PLoS One ISSN: 1932-6203 Impact factor: 3.240
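The study categorized every exam question by Bloom's level before modeling. As a toy sketch only — the verb lists and sample questions below are illustrative assumptions, not the authors' rubric, which relied on human categorization — a keyword heuristic can separate low from high Bloom's question stems:

```python
# Toy illustration of sorting exam questions into low vs. high Bloom's
# levels. The verb lists are a common classroom heuristic and are NOT
# the protocol used in the study.
LOW_BLOOMS_VERBS = {"define", "list", "name", "describe", "identify"}      # knowledge/comprehension
HIGH_BLOOMS_VERBS = {"predict", "analyze", "design", "evaluate", "justify"}  # application and above

def blooms_level(question: str) -> str:
    """Classify a question as 'low' or 'high' Bloom's by its first matching verb."""
    for word in question.lower().split():
        word = word.strip(".,?!")
        if word in HIGH_BLOOMS_VERBS:
            return "high"
        if word in LOW_BLOOMS_VERBS:
            return "low"
    return "unclassified"

questions = [
    "Define osmosis.",                                                  # hypothetical low-Bloom's item
    "Predict what happens to a red blood cell in distilled water.",     # hypothetical high-Bloom's item
]
levels = [blooms_level(q) for q in questions]  # → ["low", "high"]
```

In the actual study each exam received an overall Bloom's score, which split the units of analysis into the low and high Bloom's groups modeled separately in Figs 2 and 3.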
Instructor and course information.

| | Lower Division | | Upper Division | Total |
|---|---|---|---|---|
| | 4 | | 6 | 10 |
| | 16 | 10 | 6 | 32 |
| | 30 | 10 | 6 | 46 |
| | 11,218 | 3,415 | 323 | 14,956 |
PORTAAL practices.

| Dimension | PORTAAL Practices | Duration (D)/Instance (I) |
|---|---|---|
| | (1) Total time (minutes) students were thinking, working, talking [TST] | D |
| | (2) | I |
| | (3) | I |
| | (4) | I |
| | (5) | I |
| | (6) Amount of time in debrief [DB] | D |
| | (7) Amount of time that students talked in debrief [ST_DB] | D |
| | (8) | I |
| | (9) | I |
| | (10) | D |
| | (11) | I |
| | (12) | I |
| | (13) | I |
| | (14) | I |
| | (15) | I |
| | (16) | I |
| | (17) | I |
| | (18) | I |
| | (19) | I |
| | (20) | I |
| | (21) | I |
Practices from PORTAAL that improve student academic performance based on evidence from the literature. Practices are clustered in four dimensions. Seven practices were removed before conducting SEM. Practices were coded as either intensity (instances) or duration (minutes).
Fig 1. Structural equation model path diagram of PORTAAL practices for all units of analysis.
Standardized path coefficients are in red. The effects of covariates and interactions are in blue. Residual variances in exam scores not explained by this model are in black. When controlling for the effects (in blue) of covariates and interactions, student exam scores would change by 0.047 standard deviations (0.047 × 13.08 = 0.61 percentage points) given a one standard deviation change in the number of small group activities (SD = 4.36), with all other evidence-based teaching practices held constant. n = 42. Significant relationships are marked with * in this diagram. *p < 0.05, **p < 0.01, ***p < 0.001.
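The caption's conversion from a standardized path coefficient to raw exam points can be reproduced directly; dividing by the practice's own standard deviation additionally yields the per-instance effect. All numbers below come from the Fig 1 caption:

```python
# Converting a standardized SEM path coefficient into raw exam points,
# using the values reported in the Fig 1 caption.
beta_std = 0.047     # standardized path: small group activities -> exam score
sd_exam = 13.08      # SD of exam scores (percentage points)
sd_practice = 4.36   # SD of the number of small group activities

# Raw exam-score change for a 1-SD increase in small group activities:
raw_per_sd = beta_std * sd_exam          # ≈ 0.61 percentage points

# Raw exam-score change per single additional small group activity:
raw_per_unit = raw_per_sd / sd_practice  # ≈ 0.14 percentage points
```

The per-instance value (~0.14 percentage points) matches the Small Group entry in the table of expected changes reported for Research Question 1.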
The relationships between the change in PORTAAL practices and the expected change in exam scores for analyses of both research questions.
| PORTAAL Practice | All Exams (RQ 1) | Low Bloom’s Exams (RQ 2) | High Bloom’s Exams (RQ 2) |
|---|---|---|---|
| Small Group | 0.14 | 0.39 | - |
| | 0.1 | 0.21 | - |
| Random Call Answers | 0.25 | -0.71 | - |
| Alternative Answers | 1.18 | 1.32 | 0.36 |
| Working Alone | - | -0.54 | 0.43 |
| Total Student Time | 0.16 | 0.35 | - |
| Student Time in Debrief | - | 0.73 | - |
| Explaining Answers | - | - | 0.20 |
| | - | - | 0.8 |
| Positive Feedback | - | - | 0.24 |
Expected change in exam scores (percentage points) predicted by one-unit (instance or minute) change in each practice. Mediators are in parentheses (F: female, M: male). Bloom’s level of exam, GPA, EOP status, URM status, and interactions between the student demographic factors are included as covariates. Cells with a dash indicate no significant correlation.
*p < 0.05,
**p < 0.01,
***p < 0.001.
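Because these SEM effects are linear, the per-unit values in the table can be scaled to project the expected score change for larger shifts in a practice. A minimal sketch — the chosen practice counts are hypothetical, while the effect sizes are the "All Exams" column values from the table:

```python
# Projecting exam-score changes from the per-unit effects in the table
# (percentage points per instance, or per minute for Total Student Time).
# Effect sizes are the significant "All Exams" entries; the increment of
# three activities is a hypothetical example.
effects_all_exams = {
    "small_group": 0.14,
    "random_call_answers": 0.25,
    "alternative_answers": 1.18,
    "total_student_time": 0.16,  # per minute
}

# e.g., adding three alternative-answer discussions per class period:
delta = 3 * effects_all_exams["alternative_answers"]  # ≈ 3.54 percentage points
```

Such linear extrapolation assumes the modeled relationship holds over the projected range; the SEM estimates were fit to the observed variation in these classrooms.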
Fig 2. SEM diagram for units with low Bloom’s level exams.
Standardized path coefficients are in red. The effects of covariates and interactions are in blue. Residual variances in exam scores not explained by this model are in black. n = 22. Significant relationships are marked with * in this diagram. *p < 0.05, **p < 0.01, ***p < 0.001.
Fig 3. SEM diagram for units with high Bloom’s level exams.
Standardized path coefficients are in red. The effects of covariates and interactions are in blue. Residual variances in exam scores not explained by this model are in black. n = 20. Significant relationships are marked with * in this diagram. *p < 0.05, **p < 0.01, ***p < 0.001.