Cara Gormally, Peggy Brickman, Mary Lutz.
Abstract
Life sciences faculty agree that developing scientific literacy is an integral part of undergraduate education and report that they teach these skills. However, few measures of scientific literacy are available to assess students' proficiency in using scientific literacy skills to solve scenarios in and beyond the undergraduate biology classroom. In this paper, we describe the development, validation, and testing of the Test of Scientific Literacy Skills (TOSLS) in five general education biology classes at three undergraduate institutions. The test measures skills related to major aspects of scientific literacy: recognizing and analyzing the use of methods of inquiry that lead to scientific knowledge and the ability to organize, analyze, and interpret quantitative data and scientific information. Measures of validity included correspondence between items and scientific literacy goals of the National Research Council and Project 2061, findings from a survey of biology faculty, expert biology educator reviews, student interviews, and statistical analyses. Classroom testing contexts varied both in terms of student demographics and pedagogical approaches. We propose that biology instructors can use the TOSLS to evaluate their students' proficiencies in using scientific literacy skills and to document the impacts of curricular reform on students' scientific literacy.
Year: 2012 PMID: 23222832 PMCID: PMC3516792 DOI: 10.1187/cbe.12-03-0026
Source DB: PubMed Journal: CBE Life Sci Educ ISSN: 1931-7913 Impact factor: 3.325
Overview of TOSLS development process
| 1 | Examined literature on existing instruments to identify scientific literacy skills |
| 2 | Conducted faculty survey to articulate what encompasses scientific literacy skills |
| 3 | Developed and administered a pilot assessment based on defined skills |
| 4 | Revised assessment based on item analyses and feedback from student interviews |
| 5 | Examined instrument validity through additional student interviews and biology faculty evaluations |
| 6 | Evaluated finalized instrument for item difficulty, item discrimination, and reliability |
| 7 | Administered instrument in multiple contexts to demonstrate utility and measure learning gains |
Categories of scientific literacy skills
| Skill | Questions | Explanation of skill | Examples of common student challenges and misconceptions |
|---|---|---|---|
| I. Understand methods of inquiry that lead to scientific knowledge | | | |
| 1. Identify a valid scientific argument | 1, 8, 11 | Recognize what qualifies as scientific evidence and when scientific evidence supports a hypothesis | Inability to link claims correctly with evidence and lack of scrutiny about evidence; “facts” or even unrelated evidence considered to be support for scientific arguments |
| 2. Evaluate the validity of sources | 10, 12, 17, 22, 26 | Distinguish between types of sources; identify bias, authority, and reliability | Inability to identify accuracy and credibility issues |
| 3. Evaluate the use and misuse of scientific information | 5, 9, 27 | Recognize a valid and ethical scientific course of action and identify appropriate use of science by government, industry, and media that is free of bias and of economic and political pressure when making societal decisions | Prevailing political beliefs can dictate how scientific findings are used. All sides of a controversy should be given equal weight regardless of their validity. |
| 4. Understand elements of research design and how they impact scientific findings/conclusions | 4, 13, 14 | Identify strengths and weaknesses in research design related to bias, sample size, randomization, and experimental control | Misunderstanding randomization contextualized in a particular study design. General lack of understanding of elements of good research design. |
| II. Organize, analyze, and interpret quantitative data and scientific information | | | |
| 5. Create graphical representations of data | 15 | Identify the appropriate format for the graphical representation of data, given a particular type of data | Scatter plots show differences between groups. Scatter plots are best for representing means, because the graph shows the entire range of data. |
| 6. Read and interpret graphical representations of data | 2, 6, 7, 18 | Interpret data presented graphically to make a conclusion about study findings | Difficulty interpreting graphs; inability to match patterns of growth (e.g., linear or exponential) with graph shape |
| 7. Solve problems using quantitative skills, including probability and statistics | 16, 20, 23 | Calculate probabilities, percentages, and frequencies to draw a conclusion | Guessing the correct answer without being able to explain basic math calculations; statements indicative of low self-efficacy: “I'm not good at math.” |
| 8. Understand and interpret basic statistics | 3, 19, 24 | Understand the need for statistics to quantify uncertainty in data | Lack of familiarity with the function of statistics and with scientific uncertainty. Statistics prove data is correct or true. |
| 9. Justify inferences, predictions, and conclusions based on quantitative data | 21, 25, 28 | Interpret data and critique experimental designs to evaluate hypotheses and recognize flaws in arguments | Tendency to misinterpret or ignore graphical data when developing a hypothesis or evaluating an argument |
Figure 1. Percentage of faculty who rated these skills (described in Table 2) as important to very important (4–5 on a 5-point scale), and percentage who currently teach and assess these skills (n = 167 faculty participants teaching a Gen Ed course).
Example questions contextualized around real-world issues
Summary of expert responses to the three queries about the 28 TOSLS questions
Values are the number of questions (out of 28) reaching each level of expert agreement.

| Subject of query | >90% agreement | >80% agreement | >70% agreement |
|---|---|---|---|
| The information given in this question is scientifically accurate. | 23 | 5 | 0 |
| The question is written clearly and precisely. | 15 | 8 | 5 |
| After taking a college Gen Ed science course, students should be able to answer this question. | 19 | 7 | 2 |
Mean pre- and posttest scores of students from each course with calculated t value and effect size, as well as scores from biology faculty experts

| Course | Pretest mean % correct (SE) | Posttest mean % correct (SE) | t value | Effect size | Pretest internal consistency | Posttest internal consistency |
|---|---|---|---|---|---|---|
| Project-based nonmajors at public research university | 61.71 (1.05) | 70.76 (0.96) | 10.51* | 0.83 | 0.734 | 0.758 |
| Traditional nonmajors at public research university | 58.33 (0.99) | 65.45 (0.92) | 9.65* | 0.48 | 0.718 | 0.713 |
| Private research university | 84.63 (1.30) | 84.95 (1.34) | 0.32 | 0.03 | 0.581 | 0.632 |
| Midsized state college | 44.29 (1.70) | 42.50 (1.56) | 1.22 | 0.12 | N/A | N/A |
| Biology majors at public research university | 61.72 (0.71) | 67.13 (0.75) | 7.65* | 0.33 | 0.682 | 0.761 |
| Biology experts | N/A | 91.43 (0.98) | N/A | N/A | N/A | N/A |

*p < 0.05 (indicates significant gains).
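The table above pairs a t value with an effect size for each course's pre-to-posttest change. The record does not spell out the exact formulas used, so the following is a minimal sketch of one common formulation: a paired-samples t statistic and a Cohen's d computed from difference scores. The function name and the three synthetic student scores in the usage line are illustrative, not data from the study.

```python
import math

def paired_t_and_cohens_d(pre, post):
    """Paired-samples t statistic and difference-score Cohen's d.

    One common formulation for pre/post designs; illustrative only,
    since the study's exact effect-size formula is not stated here.
    """
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_diff = sum(diffs) / n
    # sample standard deviation of the difference scores
    sd_diff = math.sqrt(sum((d - mean_diff) ** 2 for d in diffs) / (n - 1))
    t = mean_diff / (sd_diff / math.sqrt(n))
    cohens_d = mean_diff / sd_diff
    return t, cohens_d

# Synthetic percent-correct scores for three students:
t, d = paired_t_and_cohens_d([50, 60, 70], [55, 68, 75])
```

A positive t exceeding the critical value of the t distribution with n − 1 degrees of freedom at p < 0.05 would correspond to the starred gains in the table.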
Figure 2. (a) Pre- and postmeasures of item difficulty, with results from the nonscience majors in the lecture-based section and (b) the project-based section of Concepts in Biology in Fall 2011 (*p < 0.05 difference between pre- and posttest scores). Questions are grouped according to skills (Table 2).
Figure 3. Pre- and postmeasures of item discrimination from Fall 2011. Findings from lecture-based and project-based sections are shown combined.
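Figures 2 and 3 report item difficulty and item discrimination. In classical test theory, difficulty is simply the proportion of students answering an item correctly. The sketch below pairs that with an upper-minus-lower discrimination index (top vs. bottom 27% of students by total score); the discrimination formula is an assumption here, since the record does not state which index was used (a point-biserial correlation is another common choice). The response matrix in the usage line is synthetic.

```python
def item_stats(responses):
    """Classical test theory item statistics.

    responses: list of per-student lists of 0/1 item scores.
    Returns, per item, (difficulty, discrimination) where
    difficulty = proportion correct and discrimination is the
    upper-minus-lower index over the top/bottom 27% of students
    ranked by total score (one standard definition; the study may
    use a different one).
    """
    n_students = len(responses)
    n_items = len(responses[0])
    ranked = sorted(range(n_students), key=lambda i: sum(responses[i]))
    k = max(1, round(0.27 * n_students))
    lower, upper = ranked[:k], ranked[-k:]
    stats = []
    for j in range(n_items):
        difficulty = sum(r[j] for r in responses) / n_students
        disc = (sum(responses[i][j] for i in upper)
                - sum(responses[i][j] for i in lower)) / k
        stats.append((difficulty, disc))
    return stats

# Synthetic 4-student x 2-item response matrix:
stats = item_stats([[0, 0], [0, 1], [1, 1], [1, 1]])
```

Items with discrimination near zero (as in Figure 3's weakest items) fail to separate high- from low-scoring students.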
Demographics of courses from each institution

| | Public research university nonmajors, project-based | Public research university nonmajors, traditional | Public research university majors | Private research university nonmajors | Midsized state college nonmajors |
|---|---|---|---|---|---|
| n | 290 | 296 | 544 | 50 | 80 |
| Male (% of sample) | 38.3 | 26.4 | 40.1 | 32 | 28.78 |
| GPA | 3.49 (0.472) | 3.53 (0.415) | 3.27 (0.452) | 3.62 (0.42) | Not reported |
| Major (% of sample) | | | | | |
| Social sciences | 29.6 | 29.4 | 3.5 | 36 | 15 |
| Humanities | 12.0 | 10.8 | 1.2 | 12 | 3.8 |
| Sciences | 16.2 | 16.6 | 78.1 | 40 | 21.3 |
| Mathematics | 3.1 | 3.0 | 0.3 | 0 | 0 |
| Business | 24.1 | 19.9 | 0.8 | 0 | 12.5 |
| Journalism | 6.2 | 10.6 | 0.2 | 0 | 0 |
| Education | 6.5 | 9.5 | 1.0 | 2 | 17.5 |
| Agriculture | 2.4 | 0 | 4.6 | 0 | 3.8 |
| Engineering | 0 | 0 | 0 | 2 | 0 |
| Undecided/not reported | 0 | 0 | 0 | 0 | 6.25 |
| Number of college-level science courses completed (average) | 1.46 (1.49) | 1.12 (1.29) | 2.43 (1.44) | 2.48 (1.86) | 0.53 (0.98) |
Figure 4. Estimated marginal mean learning gains for each course, controlling for pretest scores. Letters indicate significantly different learning gains among courses (p < 0.05).
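Figure 4's estimated marginal means are posttest scores adjusted for pretest differences among courses. A minimal ANCOVA-style sketch of that adjustment: fit a single pooled within-group slope of posttest on pretest, then shift each group's posttest mean to the grand pretest mean. This is illustrative only; the study's actual model is not specified in this record, and the two groups in the usage line are synthetic.

```python
def adjusted_posttest_means(groups):
    """Pretest-adjusted (estimated marginal) posttest means.

    groups: dict mapping group name -> list of (pre, post) pairs.
    Fits one pooled within-group regression slope, then adjusts each
    group's posttest mean to the grand pretest mean. Illustrative
    sketch only, not the study's exact model.
    """
    all_pre = [p for pairs in groups.values() for p, _ in pairs]
    grand_pre = sum(all_pre) / len(all_pre)
    # pooled within-group slope of posttest on pretest
    num = den = 0.0
    for pairs in groups.values():
        mp = sum(p for p, _ in pairs) / len(pairs)
        mq = sum(q for _, q in pairs) / len(pairs)
        num += sum((p - mp) * (q - mq) for p, q in pairs)
        den += sum((p - mp) ** 2 for p, _ in pairs)
    slope = num / den
    adjusted = {}
    for name, pairs in groups.items():
        mp = sum(p for p, _ in pairs) / len(pairs)
        mq = sum(q for _, q in pairs) / len(pairs)
        adjusted[name] = mq - slope * (mp - grand_pre)
    return adjusted

# Synthetic groups: "A" gains 2 points per student, "B" gains 1.
adj = adjusted_posttest_means({"A": [(1, 3), (2, 4), (3, 5)],
                               "B": [(4, 5), (5, 6), (6, 7)]})
```

With these synthetic data, group A's adjusted mean exceeds B's even though B's raw posttest mean is higher, mirroring how the adjustment in Figure 4 separates learning gains from incoming differences.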