Julie Dangremond Stanton, Kathryn Morris Dye, Me'Shae Johnson.
Abstract
Metacognitive regulation occurs when learners regulate their thinking in order to learn. We asked how introductory and senior-level biology students compare in their use of the metacognitive regulation skill of evaluation, which is the ability to appraise the effectiveness of an individual learning strategy or an overall study plan. We coded student answers to an exam self-evaluation assignment for evidence of evaluating (n = 315). We found that introductory and senior students demonstrated similar ability to evaluate their individual strategies, but senior students were better at evaluating their overall plans. We examined students' reasoning and found that senior students use knowledge of how people learn to evaluate effective strategies, whereas introductory students consider how well a strategy aligns with the exam to determine its effectiveness. Senior students consider modifying their use of a strategy to improve its effectiveness, whereas introductory students abandon strategies they evaluate as ineffective. Both groups use performance to evaluate their plans, and some students use their feelings as a proxy for metacognition. These data reveal differences between introductory and senior students, which suggest ways metacognition might develop over time. We contextualize these results using research from cognitive science, and we consider how learning contexts can affect students' metacognition.
Year: 2019 PMID: 31144572 PMCID: PMC6755210 DOI: 10.1187/cbe.18-12-0239
Source DB: PubMed Journal: CBE Life Sci Educ ISSN: 1931-7913 Impact factor: 3.325
Evaluating individual strategies: What worked wellᵃ
| Evaluating code | Percentage of introductory biology students | Percentage of senior-level biology students | Example student response and content analysis notes |
|---|---|---|---|
| Sufficient evidence | 63.1 (89/141) | 58.0 (101/174) | Strategy that worked well: Writing down everything in their own words on blank paper (without resources)<br>Why this worked well: “I forced myself to write things down to make sure I knew them. It’s easy to think you know something without actually knowing it, so writing helps.”<br>Note: The student identifies a strategy that worked well and explains how the strategy helped with learning by allowing the student to monitor understanding of what he or she did and did not know. |
| Partial evidence | 33.3 (47/141) | 34.5 (60/174) | Strategy that worked well: Watching videos about course concepts<br>Why this worked well: “The videos were very helpful and a great source. I prefer the videos to the other sources. Videos go more slowly and explain in simpler terms. It’s also visually appealing.”<br>Note: The student identifies a strategy that works well and writes about a preference for videos because of their pace, accessibility, and visual appeal, but does not elaborate on how videos help with learning. |
| Insufficient evidence | 3.5 (5/141) | 7.5 (13/174) | Strategy that worked well: (not applicable—student does not select any strategies that worked well, reports that all strategies worked well)<br>Why they worked well: “I got a better understanding of broad information.”<br>Note: The student does not identify any specific strategies that worked well and gives a general explanation for why all strategies worked well. |
ᵃWe asked introductory biology (n = 141) and senior-level biology students (n = 174) “Which study strategies (from your list above) worked well for you?” and “Why did these study strategies work well for you?” Using content analysis, we coded students’ answers as providing sufficient, partial, or insufficient evidence of evaluating (see Methods). The percentage and number of students in each category are shown. We performed a chi-square test of independence to determine whether there were differences in the amount of evidence introductory and senior students provided (p = 0.29, df = 2).
Evaluating individual strategies: What did not work wellᵃ
| Evaluating code | Percentage of introductory biology students | Percentage of senior-level biology students | Example student response and content analysis notes |
|---|---|---|---|
| Sufficient evidence | 48.9 (69/141) | 49.4 (86/174) | Strategy that did not work well: Typing class notes<br>Why this did not work well: “I don’t really think through the information I type. I don’t think I was really absorbing the information, and I was unable to recall the information on the test.”<br>Note: The student identifies a strategy that did not work well and writes about how typing notes is passive and does not require the student to think about the material in a way that aided retention or learning. |
| Partial evidence | 36.1 (51/141) | 34.5 (60/174) | Strategy that did not work well: Reading through notes from class<br>Why this did not work well: “I can’t read my own handwriting and I didn’t take complete notes sometimes.”<br>Note: The student identifies a strategy that did not work well and writes about the quality of the notes, but does not elaborate on how this affected learning. |
| Insufficient evidence | 14.9 (21/141) | 16.1 (28/174) | Strategy that did not work well: (not applicable—student does not select any strategies that did not work well)<br>Why this did not work well: “I should have studied more with others than by myself. I think I could have benefited more from talking through topics with others rather than dwelling on subjects I could not figure out by myself.”<br>Note: The student does not identify a strategy that did not work well and writes about what he or she should have done rather than what he or she did. The student is reflecting, but not evaluating the ineffectiveness of the strategy. |
ᵃWe asked introductory biology (n = 141) and senior-level biology students (n = 174) “Which study strategies (from your list above) did not work well for you?” and “Why didn’t these study strategies work well for you?” We coded students’ answers as providing sufficient, partial, or insufficient evidence of evaluating using content analysis. We performed a chi-square test of independence to determine whether there were differences between the two groups (p = 0.93, df = 2).
Evaluating overall study plansᵃ
| Evaluating code | Percentage of introductory biology students | Percentage of senior-level biology students | Example student response and content analysis notes |
|---|---|---|---|
| Sufficient evidence | 17.7 (25/141) | 32.2 (56/174) | Student’s evaluation: “My plan was somewhat effective.”<br>Explanation: “I understood the details of and reasons for proteins in each pathway. However, my plan was not very time efficient and I was unable to see the big picture. It was also difficult to ‘trace’ pathways from beginning to end because of this lack of ‘big picture’ understanding.”<br>Note: The student appraises the study plan in three ways (using personal insights), explaining that the plan allowed him or her to learn detailed information, was not efficient and did not include approaches for seeing the major themes, and affected the learning of whole pathways. |
| Partial evidence | 52.5 (74/141) | 50.6 (88/174) | Student’s evaluation: “My plan was moderately effective.”<br>Explanation: “I did alright [on the exam], but definitely could have done better by studying more efficiently.”<br>Note: The student appraises the study plan in two ways, writing about exam performance (outside information) and the efficiency of the study plan (personal insight). Yet the student does not explain why the study plan was not efficient or how this affected his or her learning. |
| Insufficient evidence | 29.8 (42/141) | 17.2 (30/174) | Student’s evaluation: “My plan was relatively effective.”<br>Explanation: “I did better on the exam than I expected to compared to the average.”<br>Note: The student appraises the study plan solely on performance (outside information) and does not offer personal insights on the plan’s effectiveness. |
ᵃTo examine evaluation of overall study plans, we asked introductory biology (n = 141) and senior-level biology students (n = 174) “How effective was your study plan for exam one? Please explain your answer.” We coded students’ answers as providing sufficient, partial, or insufficient evidence of evaluating using content analysis (see Methods). The p value from our chi-square test of independence was <0.01 (p = 0.0028, df = 2), indicating there are differences in the amount of evidence introductory and senior students provided.
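The three chi-square tests of independence reported in the table footnotes can be checked directly from the counts shown above. The sketch below does this with `scipy.stats.chi2_contingency`; the authors' statistical software is not stated here, so this is an illustrative reproduction, not their original analysis.

```python
# Reproduce the three chi-square tests of independence from the counts in
# the tables above. Rows: introductory vs. senior students; columns:
# sufficient, partial, insufficient evidence of evaluating.
from scipy.stats import chi2_contingency

tables = {
    "worked well":        [[89, 47, 5],  [101, 60, 13]],
    "did not work well":  [[69, 51, 21], [86, 60, 28]],
    "overall study plan": [[25, 74, 42], [56, 88, 30]],
}

for name, counts in tables.items():
    chi2, p, df, _ = chi2_contingency(counts)
    print(f"{name}: chi2 = {chi2:.2f}, df = {df}, p = {p:.4f}")
# These match the reported values (p = 0.29, p = 0.93, p = 0.0028; df = 2):
# only the overall-study-plan comparison differs between the two groups.
```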