| Literature DB >> 32108558 |
Brie Tripp, Sophia A. Voronoff, Erin E. Shortlidge.
Abstract
A desired outcome of education reform efforts is for undergraduates to effectively integrate knowledge across disciplines in order to evaluate and address real-world issues. Yet there are few assessments designed to measure if and how students think interdisciplinarily. Here, a sample of science faculty were surveyed to understand how they currently assess students' interdisciplinary science understanding. Results indicate that individual writing-intensive activities are the most frequently used assessment type (69%). To understand how writing assignments can accurately assess students' ability to think interdisciplinarily, we used a preexisting rubric, designed to measure social science students' interdisciplinary understanding, to assess writing assignments from 71 undergraduate science students. Semistructured interviews were conducted with 25 of those students to explore similarities and differences between assignment scores and verbal understanding of interdisciplinary science. Results suggest that certain constructs of the instrument did not fully capture this competency for our population, but instead, an interdisciplinary framework may be a better model to guide assessment development of interdisciplinary science. These data suggest that a new instrument designed through the lens of this model could more accurately characterize interdisciplinary science understanding for undergraduate students.
Year: 2020 PMID: 32108558 PMCID: PMC8697648 DOI: 10.1187/cbe.19-09-0168
Source DB: PubMed Journal: CBE Life Sci Educ ISSN: 1931-7913 Impact factor: 3.325
Shortened rubric provided to studentsa
| Rubric elements | Criteria | Guiding questions |
|---|---|---|
| Purposefulness | 1.1 | Is there a clearly stated purpose that calls for an integrative approach and a clear rationale or justification for taking this approach? |
| | 1.2 | Does the paper use the writing genre effectively to communicate with its intended audience? |
| Disciplinary grounding | 2.1 | Does the paper use disciplinary knowledge accurately and effectively (e.g., concepts, perspectives, findings, examples, relevant and credible sources)? |
| | 2.2 | Does the paper use disciplinary methods accurately and effectively (e.g., experimental design)? |
| Integration | 3.1 | Does the paper include selected disciplinary perspectives and insights from two or more disciplinary traditions presented in the course or from elsewhere that are relevant to the paper’s purpose? |
| | 3.2b | Is there an integrative device or strategy (i.e., metaphor or analogy)? |
| | 3.3 | Is there a sense of balance in the overall composition of the piece with regard to how disciplinary perspectives are brought together to advance the purpose of the piece? |
| | 3.4 | Do the conclusions drawn by the paper indicate that understanding has been advanced by the integration of disciplinary views (e.g., the paper takes full advantage of the opportunities presented by the integration of disciplinary insights to advance its intended purpose both effectively and efficiently; integration may result in novel or unexpected insights)? |
| Critical awareness | 4.1c | Does the paper exhibit awareness of the limitations and benefits of the contributing disciplines? |
| | 4.2c | Does the paper exhibit self-reflection (e.g., metacognition)? |
a Adapted from Boix Mansilla.
b Excluded from scoring.
c Merged.
Coding rubric for survey question “Please explain how you assess learning outcomes related to students’ understanding of interdisciplinary science” (n = 68)
| Themes | Examples | % of participants (n)a |
|---|---|---|
| 1. Writing activities | | 69 (47) |
| a. Writing assignments | Essays/papers | 51 (35) |
| b. Self-reflection | Journals; reflection assignments | 6 (4) |
| 2. Traditional | Exams; quizzes; homework assignments (unspecified as individual or group) | 34 (23) |
| 3. Group work | Communication/discussion; group research/projects; problem-based learning; group presentations (two or more students) | 34 (23) |
a Percentages sum to more than 100% because responses could be coded into multiple themes.
Course characterization of four upper-division natural and physical science courses
| Course | Format | Credits | Total no. of essay participants | Total no. of interview participants | Disciplinary or ID; Course-listed departments | Instructors |
|---|---|---|---|---|---|---|
| Biochemical Virology | Lecture | 1 | 11 | 4 | ID; | 1 Biochemist |
| Chemical Ecology | Lecture + research-based lab | 3 | 13 | 8 | ID; | 1 Chemist |
| Environmental Restoration | Lecture | 3 | 32 | 6 | ID; | 1 Ecologist |
| Plant Systematics | Lecture + traditional lab | 4 | 15 | 7 | Disciplinary; | 1 Biologist |
FIGURE 1. Box plots comparing students’ overall mean construct scores (n = 71). Nonidentical letters above bars represent significant (p < 0.05) differences among construct scores (as determined by ANOVA and post hoc pairwise comparisons using Tukey’s HSD). A one-way Welch’s ANOVA detected a significant difference among mean construct scores (F(3, 280) = 6.149, p = 0.00057, η2 = 0.062). Tukey’s post hoc analyses revealed that students scored significantly higher on purposefulness than on integration and critical awareness (p = 0.0025 and p = 0.0139, respectively), with no significant difference between the latter two constructs. Students performed significantly better on disciplinary grounding than on integration (p = 0.0185), with no significant difference between disciplinary grounding and purposefulness. Box: 25th to 75th percentile; bars: minimum and maximum values. The error bars represent the standard error of the mean.
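The omnibus-test-plus-Tukey-HSD pipeline described in the Figure 1 caption can be sketched as follows. This is a minimal illustration on synthetic scores (the study's raw data are not reproduced here; the group means and spreads below are invented), and it uses SciPy's classic one-way ANOVA rather than the Welch variant reported in the paper, since SciPy does not ship a Welch ANOVA.

```python
# Sketch of the Figure 1 analysis on synthetic data.
# NOTE: all scores below are hypothetical, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical rubric scores (1 = naive ... 4 = mastery), n = 71 per construct.
purpose = rng.normal(3.2, 0.5, 71)
grounding = rng.normal(3.0, 0.5, 71)
integration = rng.normal(2.6, 0.5, 71)
awareness = rng.normal(2.7, 0.5, 71)

# Omnibus one-way ANOVA across the four constructs.
f_stat, p_val = stats.f_oneway(purpose, grounding, integration, awareness)
print(f"F = {f_stat:.2f}, p = {p_val:.4g}")

# Tukey's HSD for post hoc pairwise comparisons (requires SciPy >= 1.8).
result = stats.tukey_hsd(purpose, grounding, integration, awareness)
print(result)
```

With group mean differences this large relative to the spread, the omnibus test is significant and the Tukey table then identifies which construct pairs differ, mirroring the letter annotations in the figure.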
FIGURE 2. Box plots comparing students’ mean essay scores across four upper-division courses (n = 71). Nonidentical letters above bars represent significant (p < 0.05) differences among courses (as determined by ANOVA and post hoc pairwise comparisons using Tukey’s HSD). One-way ANOVA revealed a significant difference in mean essay scores among courses (F(3, 67) = 3.691, p = 0.016, η2 = 0.142). A Tukey’s post hoc test indicated a significant difference in mean essay scores between Chemical Ecology and Environmental Restoration (p = 0.0187), with no significant differences between other courses. Box: 25th to 75th percentile; bars: minimum and maximum values.
FIGURE 3. Comparison of mean construct scores for students enrolled in four courses (n = 71). Nonidentical letters above bars represent significant (p < 0.05) differences among courses within each construct (as determined by ANOVA and post hoc pairwise comparisons using Tukey’s HSD). One-way ANOVA indicated a significant difference between course scores based on the constructs disciplinary grounding (F(3, 68) = 14.5, p < 0.0001, η2 = 0.329), integration (F(3, 68) = 19.2, p < 0.0001, η2 = 0.401), and critical awareness (F(3, 68) = 8.38, p = 0.0003, η2 = 0.187; Welch’s ANOVA for unequal variances reported based on significant Levene’s test for integration and critical awareness). Tukey’s post hoc tests: (A) construct purposefulness: no significant differences in student scores across courses; (B) construct disciplinary grounding: students in Chemical Ecology, Biochemical Virology, and Plant Systematics score significantly higher than students in Environmental Restoration (p < 0.0001, p = 0.0024, and p = 0.0435, respectively); (C) construct integration: students enrolled in Biochemical Virology and Chemical Ecology significantly outperformed students in Plant Systematics (p = 0.0207 and p = 0.0138, respectively) and in Environmental Restoration (p < 0.0001 for both courses); (D) construct critical awareness: students in Chemical Ecology and Environmental Restoration scored significantly higher than students in Plant Systematics (p = 0.006 and p = 0.016, respectively). The error bars represent the standard error of the mean.
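The Figure 3 caption reports Welch's ANOVA in place of the classic test whenever Levene's test flags unequal variances. That decision rule can be sketched with Levene's test from SciPy; the helper function `choose_anova` and all data below are hypothetical, and the actual Welch computation (available in third-party packages such as `pingouin`) is not shown.

```python
# Sketch of the "Welch's ANOVA if Levene's test is significant" rule
# used in the Figure 3 caption. Data and function name are hypothetical.
import numpy as np
from scipy import stats

def choose_anova(*groups, alpha=0.05):
    """Return ('welch' or 'classic', Levene p-value) per the caption's rule."""
    _, p_levene = stats.levene(*groups)
    return ("welch" if p_levene < alpha else "classic"), p_levene

rng = np.random.default_rng(1)
# Four synthetic groups with similar variances, and four with very different ones.
equal = [rng.normal(0.0, 1.0, 50) for _ in range(4)]
unequal = [rng.normal(0.0, s, 50) for s in (0.3, 0.6, 1.5, 3.0)]

print(choose_anova(*equal))
print(choose_anova(*unequal))
```

Levene's test (which defaults to the median-centered, Brown-Forsythe variant in SciPy) is preferred here over Bartlett's test because it is less sensitive to non-normal score distributions.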
FIGURE 4. Numeric construct scores, (1) naïve, (2) novice, (3) apprentice, and (4) mastery, matched with same-student binary interview score (yes, no). (A) Disciplinary grounding, (B) integration, and (C) critical awareness. Bubble size corresponds to the number of students who obtained a given construct and interview score (i.e., larger bubbles indicate a greater number of students who received a particular matched score).
Examples of matched and mismatched understanding of ID from same-student essay and interview responses
| Construct | Essay responses | Interview quotes |
|---|---|---|
| | “The unknown plant bears fruits that appear healthy and edible, but without analysis of their nutritional content nothing can be said for certain. We intend on determining the mineral content of the fruit using near-infrared reflectance spectroscopy, as well as measuring secondary metabolites to deter herbivory. Assessing floral morphology will provide insight into its pollination syndrome, and, consequently, its method of pollination.” | “I think about how plants use compounds, there’s all sorts of ecological relationships between plants, and different organisms, and pollinators, and the idea of plants producing nectar has a lot to do with chemistry. Then plants producing all sorts of volatile compounds that attract predatory organisms for defences.” |
| | “The morphological character of the flower also does not indicate bee pollination. The inflorescence consists of a single yellow-orange tubular corolla with a deep nectar reserve, which suggests pollination by […]” | “We talked about compounds and secondary compounds of plants. There’s even, when you go down to systematics you’re talking about how things are related. To find out how things are related you look at the DNA of plants at the molecular level through DNA sequencing and GenBank as well as how they work morphologically.” |
| | “We will perform a phylogenetic analysis using microsatellites to find out what species of fruit or vegetable this plant is most closely related to. We will use microsatellites since this new species must have recently diverged from an extant crop plant species. We can then contact chemists to analyze the chemical compounds present and correlate this with related species from the phylogenetic analysis.” | “It’s important to know how things are actually working, requiring the knowledge of chemistry and viewing biological systems in a chemistry sort of lens. Learning about geology and chemistry would really help in phylogenetic projects, just because understanding the history of the earth and the geography can help us interpret trends in the genotypes of organisms. The moulding of these knowledge sets ends in a greater understanding of plants holistically.” |
| | “How the park will be restored mostly comes down to the project goals. This is a public park after all […] not a far out wilderness ecosystem. So, what does the public want?” | “[Environmental restoration] means using systems science and science of cycles in biogeochemistry. It’s trying to bring back a previous state using history to look back at reference sites. Restoration requires collaborating between experts, having a more well-rounded view, because you’re bring[ing] in hydrologist to geologist, a biologist, a chemist. You’re thinking about all the different aspects of something instead of being one sided.” |
| | “If the species is determined to be a self-pollinator and we determined the origin of its evolution through genetic sequencing there is a possibility that we could use cross pollination. However, as many self-pollinators use wind or rain as transportation modes for pollen, this could ultimately lead to an uncontrolled spread of the plants’ genes to other species, thus having a negative effect [on] the ecosystem. Alternatively, we could assess pollination through the measurement of volatile organic compounds. If all else fails, I would reassess my methodological approach.” | “I like the, ‘it may or may not happen this way’, in biology. I love going out into nature and [wondering], ‘Why is it that way?’ It is very important to set it up beforehand, like my bee pollination experimental design, and map it out and it may not go as planned. A big part of science is just recognizing why you failed or how you can do things better the next time around. Why didn’t they pollinate? Why did the plants not sprout? Why did we not get the results that we wanted? You need to go back and check your experimental process!” |
| | “We can live in a better world, and this better world must inherently include all people on the planet earth. By providing a sustainable, high nutrient food source, we can [achieve] this dream thereby halting human starvation.” | “Learning about how to deal with experiments not turning out how you want them to turn out—what’s possibly good data when addressing the behemoth issue of food insecurity. Learning to take a step back—which variable or parameters are we going to change here to make this still useful, even though it didn’t turn out how we wanted it to turn out.” |