
Direct Observation of Clinical Skills Feedback Scale: Development and Validity Evidence.

Samantha Halman, Nancy Dudek, Timothy Wood, Debra Pugh, Claire Touchie, Sean McAleer, Susan Humphrey-Murto.

Abstract

Construct: This article describes the development and validity evidence behind a new rating scale to assess feedback quality in the clinical workplace.
BACKGROUND: Competency-based medical education has mandated a shift to learner-centeredness, authentic observation, and frequent formative assessments with a focus on the delivery of effective feedback. Because feedback has been shown to be of variable quality and effectiveness, an assessment of feedback quality in the workplace is important to ensure we are providing trainees with optimal learning opportunities. The purposes of this project were to develop a rating scale for the quality of verbal feedback in the workplace (the Direct Observation of Clinical Skills Feedback Scale [DOCS-FBS]) and to gather validity evidence for its use.
APPROACH: Two panels of experts (local and national) took part in a nominal group technique to identify features of high-quality feedback. Through multiple iterations and review, 9 features were developed into the DOCS-FBS. Four rater types (residents n = 21, medical students n = 8, faculty n = 12, and educators n = 12) used the DOCS-FBS to rate videotaped feedback encounters of variable quality. The psychometric properties of the scale were determined using a generalizability analysis. Participants also completed a survey, using 5-point Likert scales, to assess the ease of use, clarity, knowledge acquisition, and acceptability of the scale.
RESULTS: Mean video ratings ranged from 1.38 to 2.96 out of 3 and followed the intended pattern suggesting that the tool allowed raters to distinguish between examples of higher and lower quality feedback. There were no significant differences between rater type (range = 2.36-2.49), suggesting that all groups of raters used the tool in the same way. The generalizability coefficients for the scale ranged from 0.97 to 0.99. Item-total correlations were all above 0.80, suggesting some redundancy in items. Participants found the scale easy to use (M = 4.31/5) and clear (M = 4.23/5), and most would recommend its use (M = 4.15/5). Use of DOCS-FBS was acceptable to both trainees (M = 4.34/5) and supervisors (M = 4.22/5).
CONCLUSIONS: The DOCS-FBS can reliably differentiate between feedback encounters of higher and lower quality. The scale has been shown to have excellent internal consistency. We foresee the DOCS-FBS being used as a means to provide objective evidence that faculty development efforts aimed at improving feedback skills can yield results through formal assessment of feedback quality.

Keywords:  Feedback; assessment; scale development

Year:  2016        PMID: 27285377     DOI: 10.1080/10401334.2016.1186552

Source DB:  PubMed          Journal:  Teach Learn Med        ISSN: 1040-1334            Impact factor:   2.414


Related articles: 7 in total

1.  Competency based clinical shoulder examination training improves physical exam, confidence, and knowledge in common shoulder conditions.

Authors:  Michal Kalli Hose; John Fontanesi; Manjulika Woytowitz; Diego Jarrin; Anna Quan
Journal:  J Gen Intern Med       Date:  2017-08-07       Impact factor: 5.128

2.  Advancing Our Understanding of Narrative Comments Generated by Direct Observation Tools: Lessons From the Psychopharmacotherapy-Structured Clinical Observation.

Authors:  John Q Young; Rebekah Sugarman; Eric Holmboe; Patricia S O'Sullivan
Journal:  J Grad Med Educ       Date:  2019-10

3.  eConsult Specialist Quality of Response (eSQUARE): A novel tool to measure specialist correspondence via electronic consultation.

Authors:  Christopher Tran; Douglas Archibald; Susan Humphrey-Murto; Timothy J Wood; Nancy Dudek; Clare Liddy; Erin Keely
Journal:  J Telemed Telecare       Date:  2021-03-03       Impact factor: 6.344

4.  How does culture affect experiential training feedback in exported Canadian health professional curricula?

Authors:  Kerry Wilbur; Rasha Mousa Bacha; Somaia Abdelaziz
Journal:  Int J Med Educ       Date:  2017-03-17

5.  Longitudinal Faculty Development Program to Promote Effective Observation and Feedback Skills in Direct Clinical Observation.

Authors:  Sheira Schlair; Lawrence Dyche; Felise Milan
Journal:  MedEdPORTAL       Date:  2017-10-30

6.  The use of factor analysis and abductive inference to explore students' and practitioners' perspectives of feedback: divergent or congruent understanding?

Authors:  Christine Ossenberg; Amanda Henderson; Marion Mitchell
Journal:  BMC Med Educ       Date:  2020-11-25       Impact factor: 2.463

7.  A mobile app to capture EPA assessment data: Utilizing the consolidated framework for implementation research to identify enablers and barriers to engagement.

Authors:  John Q Young; Rebekah Sugarman; Jessica Schwartz; Matthew McClure; Patricia S O'Sullivan
Journal:  Perspect Med Educ       Date:  2020-08
