Noam Angrist, Peter Bergman, David K Evans, Susannah Hares, Matthew C H Jukes, Thato Letsomo.
Abstract
School closures affecting more than 1.5 billion children are designed to prevent the spread of COVID-19, but they simultaneously introduce new short-term and long-term health risks through lost education. Measuring these effects in real time is critical to inform effective public health responses, and remote phone-based approaches are one of the only viable options while extreme social distancing is in place. However, both the health and education literatures offer little guidance on phone-based assessments. In this article, we draw on our pilot testing of phone-based assessments in Botswana, along with the existing literature on oral testing of reading and mathematics, to propose a series of preliminary practical lessons to guide researchers and service providers as they try phone-based learning assessments. We provide preliminary evidence that phone-based assessments can accurately capture basic numeracy skills. We provide guidance to help teams (1) ensure that children are not put at risk; (2) test the reliability and validity of phone-based measures; (3) use simple instructions and practice items to ensure the assessment focuses on the target skill rather than on general language and test-taking skills; (4) adapt the items from oral assessments that will be most effective in phone-based assessments; (5) keep assessments brief while still gathering meaningful learning data; (6) use effective strategies to encourage respondents to pick up the phone; (7) build rapport with adult caregivers and youth respondents; (8) choose the most cost-effective medium; and (9) account for potential bias in samples. © Author(s) (or their employer(s)) 2020. Re-use permitted under CC BY. Published by BMJ.
Keywords: health economics; health services research; public health
Year: 2020 PMID: 32699155 PMCID: PMC7380711 DOI: 10.1136/bmjgh-2020-003030
Source DB: PubMed Journal: BMJ Glob Health ISSN: 2059-7908
Figure 1. Percentage of students reaching each level of question difficulty (no operations, addition, subtraction, multiplication and division) in the phone-based sample and in the face-to-face Annual Status of Education Report (ASER) test of the same content. These graphs are for the same regions and largely the same set of schools and grades, but they are not matched to the exact same cohort of students. They reveal a similar distribution of learning levels using the phone and face-to-face assessments at the population level in similar geographies and ages, and they increase our confidence in phone assessment. However, this is not yet a formal validity assessment. We plan to implement that in future phone-based assessments.
Figure 2. The relationship between student answers on the 'problem of the day' on the last day of class and average learning levels for the whole class after 15 days. Estimates were averaged at the class level within a school for a sample of 40 classes. Each individual student answered a 'problem of the day' in an individual booklet, which was compiled by the class teacher. If students answered problems correctly, then they progressed to more difficult items. At the end of 15 days, a more comprehensive multi-item oral assessment (the Annual Status of Education Report, or ASER, assessment) was administered. In this figure, we compare the final problem-of-the-day level of difficulty with performance on the ASER test.
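The class-level comparison described above can be sketched in code: average each student's final problem-of-the-day difficulty level and each student's ASER score within a class, then correlate the two class means across classes. This is a minimal illustrative sketch only; the class identifiers and all numbers below are synthetic stand-ins, not study data, and the paper does not specify its exact aggregation code.

```python
# Illustrative sketch (synthetic data): average each measure within a class,
# then compute the Pearson correlation of the class means across classes.
from statistics import mean

# Hypothetical records: (class_id, final_problem_level, aser_score)
records = [
    ("class_a", 2, 1.8), ("class_a", 3, 2.9), ("class_a", 4, 3.5),
    ("class_b", 1, 1.2), ("class_b", 2, 2.1), ("class_b", 1, 1.0),
    ("class_c", 4, 3.8), ("class_c", 5, 4.6), ("class_c", 4, 4.1),
]

# Group students by class, then average both measures within each class
by_class = {}
for class_id, level, aser in records:
    by_class.setdefault(class_id, []).append((level, aser))
class_means = {
    c: (mean(l for l, _ in rows), mean(a for _, a in rows))
    for c, rows in by_class.items()
}

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

levels = [v[0] for v in class_means.values()]
asers = [v[1] for v in class_means.values()]
r = pearson(levels, asers)
print(f"class-level correlation: r = {r:.2f}")
```

Averaging within classes before correlating, as the figure does, smooths out individual-student noise but reduces the effective sample size to the number of classes (40 in the study).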