Sarah L. Eddy, Mercedes Converse, Mary Pat Wenderoth.
Abstract
There is extensive evidence that active learning works better than a completely passive lecture. Despite this evidence, adoption of these evidence-based teaching practices remains low. In this paper, we offer one tool to help faculty members implement active learning. This tool identifies 21 readily implemented elements that have been shown to improve student outcomes related to achievement, logic development, or other relevant learning goals in college-age students. Thus, this tool both clarifies the research-supported elements of best practices for instructor implementation of active learning in the classroom setting and measures instructors' alignment with these practices. We describe how we reviewed the discipline-based education research literature to identify best practices in active learning for adult learners in the classroom and used these results to develop an observation tool (Practical Observation Rubric To Assess Active Learning, or PORTAAL) that documents the extent to which instructors incorporate these practices into their classrooms. We then use PORTAAL to explore the classroom practices of 25 introductory biology instructors who employ some form of active learning. Overall, PORTAAL documents how well aligned classrooms are with research-supported best practices for active learning and provides specific feedback and guidance to instructors to allow them to identify what they do well and what could be improved.
Year: 2015 PMID: 26033871 PMCID: PMC4477739 DOI: 10.1187/cbe.14-06-0095
Source DB: PubMed Journal: CBE Life Sci Educ ISSN: 1931-7913 Impact factor: 3.325
Figure 1. PORTAAL captures one aspect of active learning: how the instructor structures the in-class experience. Active learning is a multifaceted practice that involves inputs from the instructor and students as well as events in and outside class. All these inputs influence the ultimate outcome of student learning.
Elements in the dimension of practice and the evidence supporting them^a

| Element | How element is observed in the classroom | Increases achievement | Improves conversations | Improves other measures | Citations |
|---|---|---|---|---|---|
| **Dimension 1: Practice** | | | | | |
| P1. Frequent practice | Minutes any student has the possibility of talking through content in class | ✓ | | | |
| P2. Alignment of practice and assessment | In-class practice questions at the same cognitive skill level as course assessments (requires access to exams) | ✓ | | | |
| P3. Distributed practice | Percent of activities in which the instructor reminds students to use prior knowledge | ✓ | | | |
| P4. Immediate feedback | Percent of activities in which the instructor hears student logic and has an opportunity to respond | ✓ | | | |

^a Measures are positively correlated with the dimension unless otherwise stated. All these measures were on adult learners, although they were not all in large-lecture contexts.
Elements in the dimension of logic development and the evidence supporting them^a

| Element | How element is observed in the classroom | Increases achievement | Improves conversations | Improves other measures | Citations |
|---|---|---|---|---|---|
| **Dimension 2: Logic development** | | | | | |
| L1. Opportunities to practice higher-order skills in class | Percent of activities that require students to use higher-order cognitive skills | ✓ | | | |
| L2. Prompt students to explain/defend their answers | Percent of activities in which students are reminded to use logic | ✓1–3 | ✓4–6 | ✓6 | 1 |
| L3. Allow students time to think before they discuss answers | Percent of activities in which students are explicitly given time to think alone before having to talk in groups or in front of the class | ✓1 | ✓2 | | 1 |
| L4. Students explain their answers to their peers | Percent of activities in which students work in small groups during student engagement | ✓1–6 | ✓7–11 | | 1 |
| L5. Students solve problems without hints | Percent of activities in which the answer is not hinted at between iterations of student engagement | ✓1–4 | ✓3 | | 1 |
| L6. Students hear students describing their logic | Percent of activities in which students share their logic in front of the whole class | ✓ | | | |
| L7. Logic behind correct answer explained | Percent of activities in which the correct answer is explained | ✓1–3 | ✓2, 3 | | 1 |
| L8. Logic behind incorrect or partially incorrect answers explained | Percent of activities in which alternative answers are discussed during the debrief | ✓ | | | |

^a Measures are positively correlated with the dimension unless otherwise stated. Citations with an (S) are student self-reported measures. All these measures were on adult learners (unless denoted with an asterisk), although they were not all in large-lecture contexts.
Elements in the dimension of accountability and the evidence supporting them^a

| Element | How element is observed in the classroom | Increases achievement | Improves conversations | Improves other measures | Citations |
|---|---|---|---|---|---|
| **Dimension 3: Accountability** | | | | | |
| A1. Activities worth course points | Percent of activities worth course points (may require a syllabus or other student data source) | ✓ correct answer1, 2; participation3 | ✓ participation4–6 | | 1 |
| A2. Activities involve small-group work, so more students have the opportunity to participate | Percent of activities in which students work in small groups | ✓ | | | |
| A3. Avoid volunteer bias by using cold call or random call | Percent of activities in which cold or random call is used | ✓ | | | |

^a Measures are positively correlated with the dimension unless otherwise stated. All these measures were on adult learners, although they were not all in large-lecture contexts.
Elements in the dimension of apprehension reduction and the evidence supporting them^a

| Element | How element is observed in the classroom | Increases achievement | Improves conversations | Improves other measures | Citations |
|---|---|---|---|---|---|
| **Dimension 4: Reducing apprehension** | | | | | |
| R1. Give students practice participating by enforcing participation through cold/random call | Percent of activities with random or cold call used during student engagement or debrief | ✓ | | | |
| R2. Student confirmation: provide praise to the whole class for their work | Percent of debriefs and engagements in which the class received explicit positive feedback and/or encouragement | ✓1–3 | ✓3, 4 | | 1 |
| R3. Student confirmation: provide praise/encouragement to individual students | Percent of student responses with explicit positive feedback and/or encouragement | ✓1–3 | ✓3, 4 | | 1 |
| R4. Student confirmation: do not belittle/insult student responses | Percent of student responses that do not receive negative feedback | ✓1 | ✓2 | | 1 |
| R5. Error framing: emphasize that errors are natural/instructional | Percent of activities in which the instructor reminds students that errors are nothing to be afraid of during the introduction or student engagement periods | ✓ | | | |
| R6. Emphasize hard work over ability | Percent of activities in which the instructor explicitly praises student effort or improvement | ✓ | | | |

^a Citations with an (S) are student self-reported measures. Measures are positively correlated with the dimension unless otherwise stated. All these measures were on adult learners, although they were not all in large-lecture contexts.
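Most PORTAAL elements above are scored the same way: as the percentage of observed in-class activities in which the element was present. A minimal sketch of that scoring arithmetic, assuming a hypothetical representation of observer notes as per-activity true/false flags (the function name and data layout are invented for illustration, not taken from the PORTAAL instrument):

```python
def element_score(activity_flags):
    """Percent of observed activities in which a PORTAAL element was present.

    activity_flags: list of booleans, one per observed in-class activity
    (True = element present in that activity). Returns a percentage in
    [0, 100], or 0.0 if no activities were observed.
    """
    if not activity_flags:
        return 0.0
    return 100.0 * sum(activity_flags) / len(activity_flags)

# Example: an observer codes element P3 (distributed practice) as present
# in 3 of 4 activities during a class session.
p3_flags = [True, True, False, True]
print(element_score(p3_flags))  # 75.0
```

Elements scored in other units (e.g., P1, measured in minutes of possible student talk) would need their own tallies; this sketch covers only the percent-of-activities pattern shared by most rows in the tables.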
Figure 2. Dimension 1: Practice—variation in implementation of elements. Histograms demonstrating the variation in instructor classroom practice for each element of the dimension of practice. The black dotted line is the median for the 25 instructors; the red line is the practice of the instructor who reduced the student failure rate by 65%; and the blue line is that of the instructor who reduced the failure rate by 41%. Each quartile represents where the observations from 25% of the instructors fall. Quartiles can appear to be missing if they overlap with one another (e.g., if 50% of instructors have a score of 0 for a particular element, only the third and fourth quartiles will be visible on the graphs).
Figure 3. Dimension 2: Logic development—variation in implementation of elements. Histograms demonstrating the variation in instructor classroom practice for each element of the dimension of logic development. The black dotted line is the median for the 25 instructors; the red line is the practice of the instructor who reduced the student failure rate by 65%; and the blue line is that of the instructor who reduced the failure rate by 41%. Each quartile represents where the observations from 25% of the instructors fall. Quartiles can appear to be missing if they overlap with one another.
Figure 4. Dimension 3: Accountability—variation in implementation of elements. Histograms demonstrating the variation in instructor classroom practice for each element of the dimension of accountability. The black dotted line is the median for the 25 instructors; the red line is the practice of the instructor who reduced the student failure rate by 65%; and the blue line is that of the instructor who reduced the failure rate by 41%. Each quartile represents where the observations from 25% of the instructors fall. Quartiles can appear to be missing if they overlap with one another.
Figure 5. Dimension 4: Apprehension reduction—variation in implementation of elements. Histograms demonstrating the variation in instructor classroom practice for each element of the dimension of apprehension reduction. The black dotted line is the median for the 25 instructors; the red line is the practice of the instructor who reduced the student failure rate by 65%; and the blue line is that of the instructor who reduced the failure rate by 41%. Each quartile represents where the observations from 25% of the instructors fall. Quartiles can appear to be missing if they overlap with one another.