Erin E Shortlidge, Alison Jolley, Stephanie Shaulskiy, Emily Geraghty Ward, Christopher N Lorentz, Kari O'Connell.
Abstract
Undergraduate field experiences (UFEs) are a prominent element of science education across many disciplines; however, empirical data regarding the outcomes are often limited. UFEs are unique in that they typically take place in a field setting, are often interdisciplinary, and include diverse students. UFEs range from courses, to field trips, to residential research experiences, and thereby have the potential to yield a plethora of outcomes for undergraduate participants. The UFE community has expressed interest in better understanding how to assess the outcomes of UFEs. In response, we developed a guide for practitioners to use when assessing their UFE that promotes an evidence-based, systematic, iterative approach. This essay guides practitioners through the steps of: identifying intended UFE outcomes, considering contextual factors, determining an assessment approach, and using the information gained to inform next steps. We provide a table of common learning outcomes with aligned assessment tools, and vignettes to illustrate using the assessment guide. We aim to support comprehensive, informed assessment of UFEs, thus leading to more inclusive and reflective UFE design, and ultimately improved student outcomes. We urge practitioners to move toward evidence-based advocacy for continued support of UFEs.
Keywords: assessment; field experiences; inclusion; learning outcomes; undergraduates
Year: 2021 PMID: 34938443 PMCID: PMC8668733 DOI: 10.1002/ece3.8241
Source DB: PubMed Journal: Ecol Evol ISSN: 2045-7758 Impact factor: 2.912
FIGURE 1 Guide for Assessing Undergraduate Field Experiences (UFEs). The figure presents a guide that walks practitioners through assessing their UFE. The green arrows signify that each box informs the others, and that iterative reflection and refinement are key aspects of informed evaluation and assessment.
FIGURE 2 Vignettes of Undergraduate Field Experiences (UFEs). These vignettes (a–d) represent actual examples of UFEs and illustrate how to apply the components of the assessment guide in Figure 1 to assess each UFE. Figure 2d is based on Feig et al. (2019), Gilley et al. (2015), and Stokes et al. (2019).
TABLE 1 Intended student outcomes and aligned assessment tool examples
| Primary aim | Example student outcomes | Example assessment tools for measuring aim | Measurement details (# of items, item type, time to administer) | Population(s) tested | Ease of analysis | Original reference |
|---|---|---|---|---|---|---|
| Broader Relevance | Increased sense of connection to local/community problems or issues; increased sense of connection to large‐scale problems or issues; development as informed citizens | Perceived Cohesion Scale (PCS) | 6 items, Likert | Multiple ages & populations | Easy | Bollen, K. A., & Hoyle, R. H. (1990). Perceived cohesion: A conceptual and empirical examination. Social Forces, 69(2), 479–504. |
| Connection to Place | Increased stewardship intention or behaviors; increased respect or care for the environment; stronger connections to place | Environmental Orientations (ECO) | 16 items, Likert | Ages 6–13 | Easy | Larson, L. R., Green, G. T., & Castleberry, S. B. (2011). Construction and validation of an instrument to measure environmental orientations in a diverse group of children. Environment and Behavior, 43(1), 72–89. |
| | | Environmental Attitudes Inventory (EAI) | 24 or 72 items, Likert | Multiple ages & populations | Easy | Milfont, T. L., & Duckitt, J. (2010). The environmental attitudes inventory: A valid and reliable measure to assess the structure of environmental attitudes. Journal of Environmental Psychology, 30, 80–94. |
| | | Place Attachment Inventory (PAI) | 15 items, Likert | Multiple ages & populations | Easy | Williams, D. R., & Vaske, J. J. (2003). The measurement of place attachment: Validity and generalizability of a psychometric approach. Forest Science, 49, 830–840. |
| | | Place Meaning Questionnaire (PMQ) | 30 items, Likert | Multiple ages & populations | Easy | Young, M. (1999). The social construction of tourist places. Australian Geographer, 30, 373–389. |
| | | Place Meaning Scale‐Marine Environments (PMS‐ME) | 34 items, Likert | Tourism industry representatives, resource managers, and recreational visitors | Easy | Wynveen, C. J., & Kyle, G. T. (2015). A place meaning scale for tropical marine settings. Environmental Management, 55(1), 128–142. |
| | | New Ecological Paradigm Scale (NEP) | 15 items, Likert | Multiple ages & populations | Easy | Dunlap, R., Van Liere, K., Mertig, A., & Jones, R. E. (2000). Measuring endorsement of the new ecological paradigm: A revised NEP scale. Journal of Social Issues, 56, 425–442. |
| Nature of Science | Increased awareness of scientific ethics; stronger sense of what life as a scientist is like; increased knowledge of the nature of science; increased proficiency in general research practices | Colorado Learning Attitudes about Science Survey—Biology (CLASS‐Bio) | 31 items, Likert | Undergraduate students (University of Colorado and University of British Columbia) | Moderate | Semsar, K., Knight, J. K., Birol, G., & Smith, M. K. (2011). The Colorado Learning Attitudes about Science Survey (CLASS) for use in biology. CBE—Life Sciences Education, 10, 268–278. |
| | | Views on the Nature of Science (VNOS‐C) | Open‐ended, 45–60 min | Multiple ages & populations | Hard (requires inter‐rater review of answers) | Lederman, N. G., Abd‐El‐Khalick, F., Bell, R. L., & Schwartz, R. S. (2002). Views of nature of science questionnaire: Toward valid and meaningful assessment of learners' conceptions of nature of science. Journal of Research in Science Teaching, 39, 497–521. |
| | | Biological Experimental Design Concept Inventory (BEDCI) | 14 items, multiple choice, 18 min | Undergraduate students (University of British Columbia) | Easy | Deane, T., Nomme, K., Jeffery, E., Pollock, C., & Birol, G. (2014). Development of the biological experimental design concept inventory (BEDCI). CBE—Life Sciences Education, 13, 540–551. |
| | | Expanded Experimental Design Ability Test (E‐EDAT) | Open‐ended | Undergraduate students (University of Washington) | Moderate (rubric) | Brownell, S. E., Wenderoth, M. P., Theobald, R., Okoroafor, N., Koval, M., Freeman, S., Walcher‐Chevillet, C. L., & Crowe, A. J. (2014). How students think about experimental design: Novel conceptions revealed by in‐class activities. BioScience, 64(2), 125–137. |
| | | Experimental Design Ability Test (EDAT) | Open‐ended, 10–12 min | Undergraduate students, introductory class (Bowling Green State) | Moderate (rubric) | Sirum, K., & Humburg, J. (2011). The experimental design ability test (EDAT). Bioscene: Journal of College Biology Teaching, 37, 8–16. |
| | | The Rubric for Science Writing | Open‐ended | Undergraduate students and graduate teaching assistants (University of Southern California) | Moderate (rubric) | Timmerman, B. E. C., Strickland, D. C., Johnson, R. L., & Payne, J. R. (2011). Development of a 'universal' rubric for assessing undergraduates' scientific reasoning skills using scientific writing. Assessment & Evaluation in Higher Education, 36, 509–547. |
| | | Test of Scientific Literacy Skills (TOSLS) | Multiple choice, 30 min | Multiple populations | Easy | Gormally, C., Brickman, P., & Lutz, M. (2012). Developing a test of scientific literacy skills (TOSLS): Measuring undergraduates' evaluation of scientific information and arguments. CBE—Life Sciences Education, 11, 364–377. |
| | | Student Perceptions about Earth Science Survey (SPESS) | 29 items, Likert | Undergraduate students in earth and ocean sciences (University of British Columbia) | Moderate | Jolley, A., Lane, E., Kennedy, B., & Frappé‐Sénéclauze, T. (2012). SPESS: A new instrument for measuring student perceptions in earth and ocean science. Journal of Geoscience Education, 60(1), 83–91. |
| | | Entering Research Learning Assessment (ERLA) | 53 items, with an optional paired 47‐item assessment for mentors to assess trainee gains | Multiple populations of undergraduate and graduate trainees | Moderate (scoring guide) | Butz, A. R., & Branchaw, J. L. (2020). Entering Research Learning Assessment (ERLA): Validity evidence for an instrument to measure undergraduate and graduate research trainee development. CBE—Life Sciences Education, 19(2). |
| | | Views about Science Survey (VASS) | 30 items, Likert | 8th‐grade through undergraduate students | Easy | Halloun, I. (2001). Student views about science: A comparative survey. Beirut: Phoenix Series/Educational Research Center, Lebanese University. |
| Personal Gains | Ability to live and work in primitive or adverse camping conditions; development of or increased "grit" (perseverance through tough situations); increased content knowledge; increased interest in the topic of the field course; more refined career goals; improved discipline‐specific skills; development of outdoor skills; increased confidence in physical fitness | Grit Scale (GRIT) | 8 or 12 items, Likert | Multiple populations | Easy | Duckworth, A. L., Peterson, C., Matthews, M. D., & Kelly, D. R. (2007). Grit: Perseverance and passion for long‐term goals. Journal of Personality and Social Psychology, 92(6), 1087–1101. |
| | | Climate Change Concept Inventory | 21 items, Likert | Undergraduate students | Easy | Libarkin, J. C., Gold, A. U., Harris, S. E., McNeal, K. S., & Bowles, R. P. (2018). A new, valid measure of climate change understanding: Associations with risk perception. Climatic Change, 150(3–4), 403–416. |
| | | Geoscience Concept Inventory (GCI) | 15‐question subset selected from 73 total questions, multiple choice | Undergraduate students | Easy | Libarkin, J. C., & Anderson, S. W. (2006). The Geoscience Concept Inventory: Application of Rasch analysis to concept inventory development in higher education. In X. Liu & W. Boone (Eds.), Applications of Rasch measurement in science education (pp. 45–73). JAM Publishers. |
| | | National Survey of Student Engagement (NSSE)* | 70 items, Likert | Multiple populations | Easy | Kuh, G. D. (2009). The national survey of student engagement: Conceptual and empirical foundations. New Directions for Institutional Research, 2009, 5–20. |
| | | Landscape Identification and Formation Timescales (LIFT) | 12 items, multiple choice | Undergraduate students in earth and ocean sciences (University of British Columbia) | Easy | Jolley, A., Jones, F., & Harris, S. (2013). Measuring student knowledge of landscapes and their formation timespans. Journal of Geoscience Education, 61(2), 240–251. |
| | | Psychological Sense of School Membership (class belonging/school belonging) | 18 items, Likert | Middle school and undergraduate students | Easy | Goodenow, C. (1993). The psychological sense of school membership among adolescents: Scale development and educational correlates. Psychology in the Schools, 30, 79–90. |
| Personal Connections to Science Context | Greater sense of belonging in the scientific community; increased value for the interdisciplinary nature of science; increased interest in a general science career; increased interest in a field‐based science career; increased scientific self‐efficacy | Common Instrument Suite (CIS)* | 10 items, Likert | Grades 4 and above | Easy | |
| | | Motivated Strategies for Learning Questionnaire (MSLQ) | 81 statements, Likert | | Easy | Pintrich, P. R., & DeGroot, E. V. (1990). Motivational and self‐regulated learning components of classroom academic performance. Journal of Educational Psychology, 82, 33–40. |
| | | Science Interest Survey (SIS) | 21 items, Likert | Middle and high school children from varying ethnic backgrounds | Easy | Lamb, R. L., Annetta, L., Meldrum, J., et al. (2012). Measuring science interest: Rasch validation of the science interest survey. International Journal of Science and Mathematics Education, 10, 643–668. |
| | | Career Decision Making Survey—Self Authorship (CDMS‐SA) | 18 items, Likert | Multiple populations | Easy | Creamer, E. G., Magolda, M. B., & Yue, J. (2010). Preliminary evidence of the reliability and validity of a quantitative measure of self‐authorship. Journal of College Student Development, 51, 550–562. |
| | | Research on the Integrated Science Curriculum (RISC) | Likert, adaptable | | Easy | |
| | | Student Assessment of Learning Gains (SALG) | 5 items, Likert | College students (CSU‐Fullerton) | Easy | Casem, M. L. (2006). Student perspectives on curricular change: Lessons from an undergraduate lower‐division biology core. CBE—Life Sciences Education, 5(1), 65–75. |
| | | Science Motivation Questionnaire II (SMQII) | 25 items, Likert | College students (University of Georgia) | Easy | Glynn, S. M., Brickman, P., Armstrong, N., & Taasoobshirazi, G. (2011). Science motivation questionnaire II: Validation with science majors and non‐science majors. Journal of Research in Science Teaching, 48, 1159–1176. |
| | | Survey of Undergraduate Research Experiences (SURE) | Likert, 15 min | | Easy | Lopatto, D. (2004). Survey of undergraduate research experiences (SURE): First findings. Cell Biology Education, 3, 270–277. |
| | | Undergraduate Research Student Self‐Assessment (URSSA) | Likert, adaptable | Multiple undergraduate populations; geared toward UREs but broadly applicable | Easy | Weston, T. J., & Laursen, S. L. (2015). The Undergraduate Research Student Self‐Assessment (URSSA): Validation for use in program evaluation. CBE—Life Sciences Education, 14(3). |
| | | STEM Self‐Efficacy (STEM‐SE) | 29 items including demographic questions, Likert | Undergraduate students, with emphasis on science majors from historically underrepresented racial/ethnic groups engaged in research experiences | Easy | Byars‐Winston, A., Rogers, J., Branchaw, J., Pribbenow, C., Hanke, R., & Pfund, C. (2016). New measures assessing predictors of academic persistence for historically underrepresented racial/ethnic undergraduates in science. CBE—Life Sciences Education, 15(3), ar32. |
| | | STEM Career Interest Survey (STEM‐CIS) | 44 items, Likert | Middle school students (grades 6–8), primarily in rural, high‐poverty districts in the southeastern USA | Easy | Kier, M., Blanchard, M., Osborne, J., & Albert, J. (2014). The development of the STEM career interest survey (STEM‐CIS). Research in Science Education, 44, 461–481. |
| Transferable Skills | Improved communication skills; improved collaboration skills; improved problem‐solving skills; improved critical thinking skills | Critical Thinking Assessment Test (CAT)* | 15 items, open‐ended | Multiple populations | Moderate (scoring guide) | Stein, B., Haynes, A., Redding, M., Ennis, T., & Cecil, M. (2007). Assessing critical thinking in STEM and beyond. In Innovations in e‐learning, instruction technology, assessment, and engineering education (pp. 79–82). Springer, Netherlands. |
| | | California Critical Thinking Skills Test (CCTST)* | Multiple choice, 45 min | Undergraduate students (CSU‐Fullerton) | Easy | Facione, P. A. (1991). Using the California Critical Thinking Skills Test in research, evaluation, and assessment. [Online]. |
| | | Self‐Perceived Communication Competence (SPCC) | 12 items, numerical rating on a 100‐point scale | Undergraduate students | Easy | McCroskey, J. C., & McCroskey, L. L. (1988). Self‐report as an approach to measuring communication competence. Communication Research Reports, 5(2), 108–113. |
The intended student outcomes were first identified from the UFERN landscape study (O'Connell et al., 2020) and by participants at the 2018 UFERN Network Meeting at Kellogg Biological Station, April 30–May 2, 2018. The authors of this essay then refined the list by removing outcomes that were duplicated, irrelevant, not measurable, or tied to very specific contexts (i.e., not universal to field settings). Each outcome is grouped according to a primary aim defined in the table above. The table organizes published assessment tools that fall under each primary aim category and that are applicable to undergraduate field education experiences; it is designed to help practitioners identify instruments that align with the intended student outcomes they have identified for their field experiences. The primary aims are categories that the authors defined to link outcomes with assessments using language that is accessible to practitioners; the aim categories do not necessarily represent specific constructs or scales for individual assessments. The structure of the table follows that designed by Shortlidge and Brownell (2016).