
The IDEA Assessment Tool: Assessing the Reporting, Diagnostic Reasoning, and Decision-Making Skills Demonstrated in Medical Students' Hospital Admission Notes.

Elizabeth A Baker, Cynthia H Ledford, Louis Fogg, David P Way, Yoon Soo Park.

Abstract

CONSTRUCT: Clinical skills used in the care of patients include reporting, diagnostic reasoning, and decision-making. Written comprehensive new patient admission notes (H&Ps) are a ubiquitous part of student education but are underutilized in the assessment of clinical skills. The interpretive summary, differential diagnosis, explanation of reasoning, and alternatives (IDEA) assessment tool was developed to assess students' clinical skills using written comprehensive new patient admission notes.
BACKGROUND: The validity evidence for assessment of clinical skills using clinical documentation following authentic patient encounters has not been well documented. Diagnostic justification tools and postencounter notes are described in the literature (1,2) but are based on standardized patient encounters. To our knowledge, the IDEA assessment tool is the first published tool that uses medical students' H&Ps to rate students' clinical skills.
APPROACH: The IDEA assessment tool is a 15-item instrument that asks evaluators to rate students' reporting, diagnostic reasoning, and decision-making skills based on medical students' new patient admission notes. This study presents validity evidence in support of the IDEA assessment tool using Messick's unified framework, including content (theoretical framework), response process (interrater reliability), internal structure (factor analysis and internal-consistency reliability), and relationship to other variables.
RESULTS: Validity evidence is based on results from four studies conducted between 2010 and 2013. First, the factor analysis (2010, n = 216) yielded a three-factor solution, measuring patient story, IDEA, and completeness, with reliabilities of .79, .88, and .79, respectively. Second, an initial interrater reliability study (2010) involving two raters demonstrated fair to moderate consensus (κ = .21-.56, ρ = .42-.79). Third, a second interrater reliability study (2011) with 22 trained raters also demonstrated fair to moderate agreement (intraclass correlations [ICCs] = .29-.67). There was moderate reliability for all three skill domains, including reporting skills (ICC = .53), diagnostic reasoning skills (ICC = .64), and decision-making skills (ICC = .63). Fourth, there was a significant correlation between IDEA rating scores (2010-2013) and final Internal Medicine clerkship grades (r = .24), 95% confidence interval (CI) [.15, .33].
CONCLUSIONS: The IDEA assessment tool is a novel tool with validity evidence to support its use in the assessment of students' reporting, diagnostic reasoning, and decision-making skills. The moderate reliability achieved supports formative or lower stakes summative uses rather than high-stakes summative judgments.
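The abstract's interrater figures rest on chance-corrected agreement statistics. As a minimal illustration of the κ values reported above, the following is a from-scratch Cohen's kappa for two raters (an illustrative sketch only, not the authors' analysis code; the rater data shown are hypothetical):

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for chance.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and
    p_e is the agreement expected if the raters scored independently.
    """
    assert len(rater_a) == len(rater_b) and rater_a, "need paired ratings"
    n = len(rater_a)
    # Observed agreement: fraction of items both raters scored identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal category frequencies.
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    p_e = sum((count_a[c] / n) * (count_b[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical item-level scores (e.g., a 3-point rubric) from two raters.
perfect = cohen_kappa([1, 2, 1, 2], [1, 2, 1, 2])          # -> 1.0
partial = cohen_kappa([1, 1, 2, 2, 3, 3, 1, 2],
                      [1, 1, 2, 2, 3, 3, 2, 2])
```

Values in the .21-.56 range reported in the 2010 study correspond to "fair to moderate" agreement under common interpretive conventions; the ICCs reported for 2011 are the analogous statistic for more than two raters or continuous scores.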


Keywords:  assessment; clinical documentation review; clinical reasoning; medical student


Year:  2015        PMID: 25893938     DOI: 10.1080/10401334.2015.1011654

Source DB:  PubMed          Journal:  Teach Learn Med        ISSN: 1040-1334            Impact factor:   2.414


Related articles: 7 in total

1.  Development and Validation of a Machine Learning Model for Automated Assessment of Resident Clinical Reasoning Documentation.

Authors:  Verity Schaye; Benedict Guzman; Jesse Burk-Rafel; Marina Marin; Ilan Reinstein; David Kudlowitz; Louis Miller; Jonathan Chun; Yindalon Aphinyanaphongs
Journal:  J Gen Intern Med       Date:  2022-06-16       Impact factor: 6.473

2.  Development of a Clinical Reasoning Documentation Assessment Tool for Resident and Fellow Admission Notes: a Shared Mental Model for Feedback.

Authors:  Verity Schaye; Louis Miller; David Kudlowitz; Jonathan Chun; Jesse Burk-Rafel; Patrick Cocks; Benedict Guzman; Yindalon Aphinyanaphongs; Marina Marin
Journal:  J Gen Intern Med       Date:  2021-05-04       Impact factor: 5.128

3.  Analyzing the effectiveness of teaching and factors in clinical decision-making.

Authors:  Ming-Chen Hsieh; Ming-Shinn Lee; Tsung-Ying Chen; Tsuen-Chiuan Tsai; Yi-Fong Pai; Min-Muh Sheu
Journal:  Ci Ji Yi Xue Za Zhi       Date:  2017 Oct-Dec

4. [Review] Situational awareness within objective structured clinical examination stations in undergraduate medical training - a literature search.

Authors:  Markus A Fischer; Kieran M Kennedy; Steven Durning; Marlies P Schijven; Jean Ker; Paul O'Connor; Eva Doherty; Thomas J B Kropmans
Journal:  BMC Med Educ       Date:  2017-12-21       Impact factor: 2.463

5.  Assessing Clinical Reasoning: Targeting the Higher Levels of the Pyramid.

Authors:  Harish Thampy; Emma Willert; Subha Ramani
Journal:  J Gen Intern Med       Date:  2019-08       Impact factor: 5.128

6.  Self-Directed Rater Training for Pediatric History and Physical Exam Evaluation (P-HAPEE) Rubric, a Validated Written H&P Assessment Tool.

Authors:  Marta A King; Carrie A Phillipi; Paula M Buchanan; Linda O Lewin
Journal:  MedEdPORTAL       Date:  2017-07-21

7. [Review] Evaluating the Clinical Reasoning of Student Health Professionals in Placement and Simulation Settings: A Systematic Review.

Authors:  Jennie Brentnall; Debbie Thackray; Belinda Judd
Journal:  Int J Environ Res Public Health       Date:  2022-01-14       Impact factor: 3.390

