Douglas Murphy, Patricia Aitchison, Virginia Hernandez Santiago, Peter Davey, Gary Mires, Dilip Nathwani.
Abstract
BACKGROUND: Healthcare professionals need to show accountability, responsibility and appropriate response to audit feedback. Assessment of Insightful Practice (engagement, insight and appropriate action for improvement) has been shown to offer a robust system, in general practice, for identifying concerns in doctors' response to independent feedback. This study investigated the system's utility in medical undergraduates. SETTING AND PARTICIPANTS: 28 fourth-year medical students reflected on their performance feedback. Reflection was supported by a staff coach. Students' portfolios were divided into two groups (n = 14). Group 1 students were assessed by three staff assessors (calibrated using group training) and Group 2 students' portfolios were assessed by three staff assessors (un-calibrated by one-to-one training). Assessments were by blinded web-based exercise and assessors were senior Medical School staff.
Year: 2015 PMID: 26232114 PMCID: PMC4522119 DOI: 10.1186/s12909-015-0406-2
Source DB: PubMed Journal: BMC Med Educ ISSN: 1472-6920 Impact factor: 2.463
Summary of tools used and process followed a
| Method of feedback | Application | Source | Prepared by |
|---|---|---|---|
| Written | End-of-Block Feedback (see Additional file | Existing Medical School Feedback | Medical School |
| ‘Spot-the-Error’ prescribing application of e-GRID Web-based interactive exercises | Examples of Patient Prescribing Chartsa (see Additional file | Developed for Study | Study Researchers |
| EMI | Extended matching item format test designed to assess the clinical application of knowledge base | Existing Medical School Feedback | Medical School |
| OSCE | 12 station clinical OSCE including consultations, examinations, procedures and data interpretation. | Existing Medical School Feedback | Medical School |
| Case Presentation | During Block PowerPoint presentation | Existing Medical School Feedback | Medical School |
| Mini-CEX | A structured observation and feedback form used to guide evaluation of student’s consultation and/or examination skills | Existing Medical School Feedback | Medical School |
| Case Based Discussion | A structured feedback form used to guide assessment of discussion regarding a patient seen in general practice | Existing Medical School Feedback | Medical School |
| Viva Assessment | Oral examination | Existing Medical School Feedback | Medical School |
The reliabilities of individual tools are not reported here
aData on 32 drug charts were made available by pharmacists at NHS Tayside. These consisted of two drug prescription charts for each attached medical student block. Other tools used are available to medical students to include when considering their suite of individual feedback
Rating questions completed by students and coaches

| Question | Rating scale | Completed by |
|---|---|---|
| Source of feedback highlighted: 1. Important issues; 2. Concern in my performance; 3. Led to planned change; 4. Gave valuable feedback | Likert 1–7*1,2 | ▪ Student participant ▪ Face-to-face coach (post-coaching session) |
| Student demonstrated: 1. Satisfactory engagement with the TIPP process; 2. Insight into feedback provided on performance; 3. Plans for appropriate action where applicable; 4. Engagement, insight and action (global rating of | Likert 1–7*1 | ▪ Anonymous coach assessor (post-coaching session) |
| 5. Suitability for student progression recommendation | Binary yes/no | ▪ Anonymous coach assessor (post-coaching session) |

Rating questions (2a) were completed by student participants (pre-coaching session) and rating questions (2a, 2b) were completed by anonymous web-based portfolio assessors (post-coaching session)
*1Likert scale descriptors (1–7): (1) strongly disagree; (3) disagree; (5) agree; (7) strongly agree
*2The AIP assessment has now been included in FIT as a self-assessment (see in Additional file 3)
Inter-rater Reliability of Assessment of Insightful Practice (AIP) Questions
3a - GROUP 1 (Calibrated Assessors)

| Number of Raters | AIP questions 1–3 (engagement, insight and action), 1–7 scale: Inter-Rater Reliability (G)b | AIP question 4 (global assessment), 1–7 scale: Inter-Rater Reliability (ICC)a (G)b (95 % confidence interval)c | AIP question 5 (dichotomous assessment on suitability for progression recommendation): Inter-Rater Reliability (ICC)a (G)b (95 % confidence interval)c |
|---|---|---|---|

3b - GROUP 2 (Un-calibrated Assessors)

| Number of Raters | Inter-Rater Reliability (G)b | Inter-Rater Reliability (ICC)a (G)b | Inter-Rater Reliability (ICC)a (G)b |
|---|---|---|---|
| 1 | 0.33 | 0.18 | 0.16 |
| 2 | 0.5 | 0.31 | 0.28 |
| 3 | 0.6 | 0.4 | 0.37 |
aIntraclass Correlation Coefficients (ICCs) are equivalent to G-coefficients in a one-facet (rater) design
bInter-rater reliability is the extent to which one rater’s assessments (or when based on multiple raters, the average of raters’ assessments) are predictive of another rater’s assessments
c95 % confidence intervals for the reliabilities (ICCs) were calculated using Fisher's Z transformation, which depends on the number of raters with a denominator of (n − 1), and so cannot be calculated when there is only one rater (Streiner and Norman [9])
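The Group 2 reliabilities rise with the number of raters in a pattern consistent with the Spearman–Brown prophecy formula, which gives the reliability of the average of k raters from the single-rater value. A minimal Python check of this reading (the function name and dictionary labels here are ours, not the paper's; the Q5 column recomputes to 0.36 rather than the reported 0.37, presumably because the published single-rater value is itself rounded):

```python
# Spearman-Brown prophecy: reliability of the mean of k raters,
# given a single-rater reliability r1.
def spearman_brown(r1: float, k: int) -> float:
    return k * r1 / (1 + (k - 1) * r1)

# Single-rater (k = 1) values from Table 3b (Group 2, un-calibrated assessors).
single_rater = {
    "AIP Q1-3 (G)": 0.33,
    "AIP Q4 (ICC)": 0.18,
    "AIP Q5 (ICC)": 0.16,
}

for label, r1 in single_rater.items():
    # Pooled reliabilities for 1, 2 and 3 raters, rounded to 2 d.p.
    pooled = [round(spearman_brown(r1, k), 2) for k in (1, 2, 3)]
    print(label, pooled)
```

For the G column this reproduces the tabled 0.33 → 0.5 → 0.6 progression, which also explains why averaging over the study's three assessors improves on any single assessor's judgement.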
Fig. 1 Cycle of insightful practice