Michael Siu Hong Wan1, Elina Tor1, Judith N Hudson2. 1. School of Medicine, The University of Notre Dame Australia, Australia. 2. Faculty of Health and Medical Sciences, University of Adelaide, Australia.
Abstract
OBJECTIVES: This study investigated whether medical students' responses to Script Concordance Testing (SCT) items represent valid clinical reasoning. Using a think-aloud approach, students provided written explanations of the reasoning that underpinned their responses, and these were reviewed for concordance with an expert reference panel. METHODS: Sets of 12, 11 and 15 SCT items were administered online to Year 3 (2018), Year 4 (2018) and Year 3 (2019) medical students, respectively. Students' free-text descriptions of the reasoning supporting each item response were analysed and compared with those of the expert panel. Response process validity was quantified as the rate of true positives (the percentage of full- and partial-credit responses derived through correct clinical reasoning) and true negatives (the percentage of no-credit responses derived through faulty clinical reasoning). RESULTS: Two hundred and nine students completed the online tests (response rate = 68.3%). The majority of students who chose the response attracting full or partial credit also provided justifications concordant with those of the experts (true positive rate of 99.6% for full-credit and 99.4% for partial-credit responses). Most responses attracting no credit were based on faulty clinical reasoning (true negative rate of 99.0%). CONCLUSIONS: The findings support the response process validity of SCT scores in the setting of undergraduate medicine. The additional written think-aloud component, used to assess clinical reasoning, provided useful information to inform student learning. However, SCT scores should be validated on each testing occasion and in other contexts.
Keywords:
assessment; clinical reasoning; response process validity; script concordance testing; written think-aloud