Thomas Deane, Kathy Nomme, Erica Jeffery, Carol Pollock, Gülnur Birol.
Abstract
We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non-expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI's robustness and sensitivity in capturing useful data relating to the students' conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills.
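For reference, the one-parameter Rasch model named in the abstract is a standard item response theory model; it gives the probability that person *j* answers item *i* correctly, with person ability θ and item difficulty *b* expressed on a common logit scale:

```latex
P(X_{ij} = 1 \mid \theta_j, b_i) = \frac{e^{\theta_j - b_i}}{1 + e^{\theta_j - b_i}}
```

This is the standard form of the model; the abstract does not reproduce the equation itself.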
Year: 2016 PMID: 26903497 PMCID: PMC4803094 DOI: 10.1187/cbe.15-06-0131
Source DB: PubMed Journal: CBE Life Sci Educ ISSN: 1931-7913 Impact factor: 3.325
Other educational instruments that assess statistical reasoning skills in students cover a range of different constructs
| Instrument/study | Summary^a | Concepts mapped |
|---|---|---|
| Statistics Concept Inventory | 25-item multiple-choice instrument; focuses on mathematics and statistics (statistical thinking, rather than statistical reasoning) | Data summary and presentation, probability, random variables, discrete probability distributions, continuous random variables and probability distributions, joint probability distributions, parameter estimation, linear regression, time series, confidence intervals and hypothesis testing, single-factor experiments and multifactor designs |
| Statistical Reasoning Assessment | Five tasks completed in a software program; aimed at high-school students | Reasoning about center (mean, mode, median), spread (range, interquartile range, variance, SD), and distribution (combination of center, spread, skewness, density, outliers, causality, chance, sampling) |
| Statistical Reasoning Assessment | 20-item multiple-choice instrument; weighted averages based on the sum of correct reasoning and misconceptions (proportion) | Interpreting probability, selecting an appropriate average, computing probability (and as a ratio), independence, sampling variability, correlation versus causation, interpreting two-way tables, importance of large samples |
| Assessment Resource Tools for Improving Statistical Thinking (ARTIST website) | 11 scales/topics (each with 7–15 multiple-choice items); administered online | Data collection, data representation, measures of center, measures of spread, normal distribution, probability, bivariate quantitative data, bivariate categorical data, sampling distributions, confidence intervals, significance tests |
| CAOS 4 | 40-item multiple-choice instrument; focuses on concepts students must master after an introductory statistics course | Includes data collection and design, descriptive statistics, graphical representations, box plots, normal distribution, bivariate data, probability, sampling variability, confidence intervals, tests of significance |
| Statistical Reasoning with Everyday Problems | 10-item open-ended instrument; graders must code student answers based on the reasoning used | Probability/chance, law of large numbers, estimation/sample bias, correlation, regression toward the mean |
| Verbal-Numerical and Graphical Pilot Study | 11 pairs of open-ended items (one verbal-numerical, one graph per pair); graders must code student answers based on the reasoning used | Reasoning on uncertainty, reasoning on association |
^a The summaries indicate how these inventories compare with SRBCI, which features 12 multiple-choice items.
The four core conceptual groupings assessed by SRBCI^a
| Core conceptual grouping | SRBCI probes student understanding that: | Question | Scenario |
|---|---|---|---|
| Repeatability of results | | 1 | A |
| | | 2 | A |
| | | 6 | B |
| Variation in data | | 3 | A |
| | | 9 | C |
| | | 10 | C |
| Hypotheses and predictions | | 5 | B |
| | | 7 | B |
| | | 12 | C |
| Sample size | | 8 | B |
| | | 4 | A |
| | | 11 | C |
^a Each of these groupings features three questions, each assessing a subtly different related concept, and the groupings are contextualized in more than one experimental scenario (scenario A = salmon; B = squirrels and raccoons; C = sunflowers).
Course descriptions and typical enrollment data for the two courses we sampled (Biology first-year level and Biology third-year level) in these analyses
| Course name and description | Typical course enrollment |
|---|---|
| | ∼1600/yr, 67 lab sections/yr, terms 1 and 2 |
| | ∼100/yr, 4 lab sections and 1 lecture section/yr, term 1 only |
Figure 1. The same linear scale provided by Rasch model analyses can be used to make quantitative assessments of person ability and item difficulty after test data have been collected and fitted to the model. A number of principles apply to such assessments: 1) Items of mean difficulty, and persons with estimated mean ability in the trait being assessed, have difficulty and ability estimates of 0. 2) Items that are more difficult than this mean difficulty and persons who have greater abilities than this mean ability have positive values (>0). 3) The converse is true for items that are easier than the mean difficulty or for persons who have lower abilities than the mean ability (<0). 4) Persons with the same ability estimates as an individual item's difficulty have a 50% chance of answering that item correctly (e.g., Student A has a 50% chance of answering item 1 correctly, while Student B will have a much lower chance of answering it correctly). 5) The linear scale is set in SDs for ability and difficulty (e.g., Student A has an ability in the trait being assessed that is precisely three SDs greater than that of Student B).
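Principles 4 and 5 in this caption follow directly from the Rasch item-response function. A minimal sketch (the ability and difficulty values below are hypothetical illustrations matching the caption's Students A and B, not estimates from the study):

```python
import math

def rasch_p_correct(ability: float, difficulty: float) -> float:
    # One-parameter Rasch model: P(correct) = 1 / (1 + exp(-(theta - b))),
    # with ability (theta) and difficulty (b) on a shared logit scale.
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# Principle 4: a person whose ability equals an item's difficulty
# has exactly a 50% chance of answering that item correctly.
print(rasch_p_correct(0.0, 0.0))             # 0.5

# Principle 5 illustrated: a hypothetical Student B whose ability is
# three scale units below the item's difficulty has a much lower chance.
print(round(rasch_p_correct(-3.0, 0.0), 3))  # 0.047
```

The 50% crossing point is what anchors items and persons to the same scale: an item's difficulty is, by definition, the ability at which success becomes a coin flip.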
Figure 2. (A) The item-person map derived from Rasch model analysis of student responses to the SRBCI from the Biology first-year-level population. SRBCI items appear on the x-axis in order of their difficulty (easiest at the top, hardest at the bottom). The frequency distribution of student abilities appears in the bars at the top (y-axis). The peak frequencies appear below the ability estimate of 0, suggesting that most students have relatively low ability in statistical reasoning at this stage of their education. (B) The item-person map derived from Rasch model analysis of student responses to the SRBCI from the Biology third-year-level population. SRBCI items appear on the x-axis in order of their difficulty (easiest at the top, hardest at the bottom). The frequency distribution of student abilities appears in the bars at the top (y-axis). There are more individuals with abilities above the ability estimate of 0, suggesting that most students have relatively high ability in statistical reasoning at this stage of their education (especially in relation to the Biology first-year-level population). (A and B) Letters after items refer to the conceptual grouping to which these items belong (R, repeatability of results; V, variation in data; H, hypotheses and predictions; S, sample size).
Comparisons between SRBCI raw scores and the estimates of student ability provided by Rasch model analyses for the two student populations assessed in these analyses (dashes: finite Rasch ability estimates do not exist for zero or perfect raw scores)
| SRBCI raw score | Ability estimate, Biology first-year level | Ability estimate, Biology third-year level |
|---|---|---|
| 0 | – | – |
| 1 | −2.83 | −2.82 |
| 2 | −1.94 | −1.91 |
| 3 | −1.33 | −1.29 |
| 4 | −0.84 | −0.80 |
| 5 | −0.40 | −0.36 |
| 6 | −0.02 | 0.04 |
| 7 | 0.43 | 0.44 |
| 8 | 0.87 | 0.86 |
| 9 | 1.35 | 1.32 |
| 10 | 1.93 | 1.88 |
| 11 | 2.79 | 2.72 |
| 12 | – | – |
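A raw-score-to-ability conversion of this kind can be reproduced in principle by solving, for each raw score, for the ability at which the Rasch-expected total score equals that raw score. A minimal sketch using bisection, with a hypothetical set of 12 item difficulties (the published SRBCI difficulty estimates are not reproduced here, so the numbers it yields will not match the table):

```python
import math

def expected_score(theta: float, difficulties: list) -> float:
    # Expected total score at ability theta under the Rasch model:
    # the sum of per-item success probabilities.
    return sum(1.0 / (1.0 + math.exp(-(theta - b))) for b in difficulties)

def ability_for_raw_score(raw: int, difficulties: list,
                          lo: float = -6.0, hi: float = 6.0,
                          tol: float = 1e-6) -> float:
    # Bisection solve for the theta whose expected score equals `raw`.
    # The expected score is strictly increasing in theta, so a simple
    # bracketing search suffices. No finite estimate exists for raw = 0
    # or raw = n (the dashes in the table).
    n = len(difficulties)
    if raw <= 0 or raw >= n:
        raise ValueError("no finite ability estimate for zero or perfect scores")
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if expected_score(mid, difficulties) < raw:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Hypothetical 12-item difficulty set (NOT the published SRBCI values).
diffs = [-2.0, -1.5, -1.0, -0.5, -0.2, 0.0, 0.2, 0.5, 1.0, 1.5, 2.0, 2.5]
print(round(ability_for_raw_score(6, diffs), 2))
```

Because the estimate depends only on the raw score and the item difficulties (not on which particular items were answered correctly), each raw score maps to a single ability value per population, which is why the table has exactly one row per score.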