Literature DB >> 28096708

Evaluating medical students' proficiency with a handheld ophthalmoscope: a pilot study.

Gregory Gilmour, James McKivigan.

Abstract

INTRODUCTION: Historically, it has been difficult to test medical students' skills with a handheld ophthalmoscope objectively. Many programs train students on plastic models of the eye, which are low-fidelity simulators of a real human eye. Given the differences between the various models and actual patients, it is difficult to be sure that true proficiency is attained. The purpose of this article is to introduce a method of testing in which a medical student must match a patient to his/her fundus photo, ensuring objective evaluation while developing skills on real patients that are more likely to transfer directly into clinical practice. PRESENTATION OF CASE: Fundus photos of standardized patients (SPs) were obtained using a retinal camera and placed into a grid using proprietary software. Medical students were then asked to examine an SP and attempt to match the patient to his/her fundus photo in the grid.
RESULTS: Of the 33 medical students tested, only 10 were able to match the SP's eye to the correct photo in the grid. The average time to a correct selection was 175 seconds, and successful students rated their confidence at 27.5% on average. Incorrect selections took less time, averaging 118 seconds, yet carried a higher student-reported confidence level of 34.8% on average. The only noteworthy predictor of success (p<0.05) was the student's age (p=0.02).
CONCLUSION: There appears to be a gap in the ophthalmoscopy training of the students tested. It is also concerning that students who selected the incorrect photo were more confident in their selections than students who chose the correct photo. More training may be necessary to close this gap, and future studies should attempt to establish continuing protocols in multiple centers.

Keywords:  computer-based testing; education; physical exam; software based; standardized patient

Year:  2016        PMID: 28096708      PMCID: PMC5207205          DOI: 10.2147/AMEP.S119440

Source DB:  PubMed          Journal:  Adv Med Educ Pract        ISSN: 1179-7258


Introduction

Ophthalmoscopy is a skill that medical students typically are introduced to during their basic science education and are then expected to master during their clinical years. Almost universally, students agree that they lack confidence in this difficult physical examination skill, and instructors have used countless teaching methods and models to attempt to bridge this perceived training gap.1–4 Models ranging from modified plastic containers to custom-made, anatomically correct plastic eyes have been tried with varying degrees of success.5,6 Distinct differences between a plastic model and an actual human patient, such as dynamic pupil size, eye movement, and patient cooperation, make it difficult to be sure that skills learned on a model will translate directly to the effective examination of a patient in the clinic. Additionally, when learning ophthalmoscopy, most students have preferred practicing on humans instead of simulators.7

Other authors have incorporated fundus photographs in matching exercises, both in person and over the internet, for training.8–10 One paper even concluded that switching from conventional ophthalmoscopes to PanOptic scopes would increase proficiency but could at least double the cost of the instrument.11 Despite all of these efforts, the level of competence in the use of the ophthalmoscope remains undefined.12

In this article, the authors used fundus photos in combination with a novel software package to establish a new method of testing that can serve both as a real-time proficiency test of ophthalmoscopy skills and as a tool that students can use to practice without the assistance of an instructor.

Methods

Healthy volunteers were recruited to serve as standardized patients (SPs). Sixteen SPs had undilated fundus photos of both eyes taken using a retinal camera (200Tx; Optos, Dunfermline, UK). The photos were sent off-site, where identifying information was stripped and the photos were randomized. The photos were then added to the prototype software developed for this study, which arranged them in a grid of three photos by three photos, for a total of nine photos per grid (Figure 1). An identical grid was produced as a laminated, full-color printed copy. Each photo was labeled with a random letter–number combination; the key was held off-site, where the examiner collecting the data could not access it. The software placed the corresponding SP's photo in the grid once and filled the remaining spaces with photos of other SPs. No pathology was identified in any of the volunteers.

Thirty-three third- and fourth-year medical students from Ross University School of Medicine were recruited using a facility email address list to examine an SP and attempt to select the patient's fundus photo from the photo grid. Before beginning, the students were asked to complete a basic questionnaire covering their age, gender, year of training, and whether or not they had taken a one-month elective course in ophthalmology. After making a selection, they were asked to rate their confidence in their choice. Time spent performing the exam was recorded, and the data were sent off-site for determination of the correct choices. The data from the survey and the results are displayed in Table 1. Each student read a sheet of instructions before beginning, and an instructional script was read to them by the examiner before the exam started. Only procedural instruction was given; no attempt was made to educate or instruct on technique.
Students were allowed to examine either the left or the right eye, and the software and paper copy were adjusted to show pictures of the chosen eye. Students were allowed to use any method they chose for the examination (technique was not evaluated), give any instructions necessary to the SP, and take as much time as needed. Two identical hand-held ophthalmoscopes (Welch Allyn, Skaneateles Falls, NY, USA) were provided, and the students were allowed to choose one and adjust it to their preference before the timer was started. Once the examination began, the students were allowed to refer to the computer screen or to the hard copy of the images. The software allowed the students to click on photos with the mouse to "gray them out" and eliminate them from consideration; clicking again restored a photo for reconsideration (Figure 2).
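The prototype software itself is proprietary, but the two behaviors described above are simple to illustrate: placing the SP's photo exactly once in a randomized 3x3 grid of distractors, and toggling photos in and out of consideration on a click. A minimal sketch in Python follows; all names here are hypothetical, not taken from the actual software.

```python
import random

# Hypothetical sketch of the grid-building and photo-elimination logic.
# Photo IDs stand in for the anonymized, letter-number-labeled images.

def build_grid(sp_photo, distractor_photos, size=9, seed=None):
    """Place the SP's photo exactly once among randomly chosen distractors."""
    rng = random.Random(seed)
    grid = rng.sample(distractor_photos, size - 1) + [sp_photo]
    rng.shuffle(grid)  # randomize where the SP's photo lands in the grid
    return grid

class PhotoGrid:
    """Tracks which photos the student has 'grayed out' (eliminated)."""
    def __init__(self, photos):
        self.photos = photos
        self.eliminated = set()

    def click(self, photo):
        # First click grays a photo out; a second click restores it.
        if photo in self.eliminated:
            self.eliminated.discard(photo)
        else:
            self.eliminated.add(photo)

    def remaining(self):
        return [p for p in self.photos if p not in self.eliminated]
```

Because the key linking photos to SPs was held off-site, scoring in the actual study happened after the fact; a sketch like this one would only reproduce the in-room interaction, not the blinded grading.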
Figure 1

A selection of nine randomized photos from 16 standardized patients was laid out on a grid, each labeled with a random letter–number combination.

Table 1

Participants’ responses to survey questions

Age (years) | Year of training | Ophthalmology elective | Gender | Confidence | Time (seconds) | Answered correctly
24 | 3 | No | Female | 0% | 206 | Yes
28 | 3 | No | Female | 25% | 120 | No
24 | 3 | No | Male | 50% | 219 | No
25 | 3 | No | Male | 75% | 80 | No
25 | 3 | No | Male | 50% | 129 | Yes
24 | 3 | No | Male | 50% | 65 | No
25 | 3 | No | Male | 25% | 224 | No
31 | 3 | No | Male | 0% | 150 | Yes
27 | 3 | No | Male | 50% | 60 | No
27 | 3 | No | Female | 25% | 110 | No
37 | 3 | No | Male | 50% | 173 | No
25 | 3 | No | Female | 25% | 115 | No
26 | 3 | No | Male | 25% | 67 | No
26 | 3 | No | Female | 50% | 110 | No
26 | 3 | No | Male | 25% | 183 | No
27 | 4 | Yes | Female | 25% | 62 | No
28 | 3 | No | Male | 75% | 55 | Yes
31 | 4 | No | Male | 0% | 120 | No
34 | 3 | No | Male | 0% | 145 | Yes
24 | 3 | No | Male | 25% | 108 | No
28 | 4 | No | Male | 50% | 196 | No
51 | 4 | No | Male | 75% | 190 | Yes
28 | 4 | No | Male | 25% | 371 | Yes
38 | 3 | No | Male | 25% | 222 | Yes
24 | 3 | No | Female | 50% | 87 | No
34 | 3 | No | Female | 0% | 71 | Yes
28 | 4 | Yes | Female | 25% | 115 | No
29 | 3 | No | Male | 50% | 95 | No
25 | 3 | No | Male | 25% | 163 | No
29 | 3 | No | Female | 50% | 200 | No
26 | 3 | No | Male | 25% | 48 | No
24 | 3 | No | Male | 25% | 209 | Yes

Note: Students who selected the correct photo are identified in the "Answered correctly" column.

Figure 2

Clicking on a photo could deselect it to remove it from consideration.

When the students had made their final selection, they verbally reported it to the examiner. Before beginning recruitment of volunteer SPs and medical student test subjects, approval for this study was obtained from the institutional review board of St. Joseph Mercy Oakland Hospital in Pontiac, MI. Written informed consent was obtained from all participants.

Results

After all students had completed the assessment, the results were compiled. Of the 33 students who participated, 10 correctly selected their SP's photo from the grid (30%), completing the exam in an average time of 175 seconds with an average confidence rating of 27.5%. The remaining 23 students selected an incorrect photo in an average time of 118 seconds with an average confidence rating of 34.8%. Multiple regression was performed to check for significance between any of the variables and making a correct selection, as shown in Table 2. Only the medical students' age showed a significant p-value, 0.023 (<0.05).
Table 2

Multiple regression of variables against correct photo selection

Variables | Coefficients | Standard error | p-Value
Intercept | −0.0499 | 0.791 | 0.950
Age | 0.037 | 0.015 | 0.023*
Year of training | −0.239 | 0.283 | 0.406
Ophthalmology elective | 0.032 | 0.457 | 0.944
Gender | −0.09 | 0.186 | 0.630
Confidence | −0.004 | 0.003 | 0.216
Time | 0.001 | 0.001 | 0.160

Note: *Significant p-value (p<0.05).
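The coefficients in Table 2 are consistent with an ordinary-least-squares linear probability model: the binary correct/incorrect outcome regressed on the six predictors. A minimal Python sketch of that analysis is below. The 0/1 codings for the categorical variables are assumptions not stated in the paper, and Table 1 lists 32 participants while 33 took part, so the estimates only approximate Table 2.

```python
import numpy as np

# Sketch of a linear-probability multiple regression like Table 2, fitted by
# ordinary least squares on the rows shown in Table 1.
# Coding assumptions (not stated in the paper): elective No=0/Yes=1,
# gender Female=0/Male=1, correct No=0/Yes=1.

rows = [  # (age, year, elective, gender, confidence, time, correct)
    (24,3,0,0,0,206,1), (28,3,0,0,25,120,0), (24,3,0,1,50,219,0),
    (25,3,0,1,75,80,0), (25,3,0,1,50,129,1), (24,3,0,1,50,65,0),
    (25,3,0,1,25,224,0), (31,3,0,1,0,150,1), (27,3,0,1,50,60,0),
    (27,3,0,0,25,110,0), (37,3,0,1,50,173,0), (25,3,0,0,25,115,0),
    (26,3,0,1,25,67,0), (26,3,0,0,50,110,0), (26,3,0,1,25,183,0),
    (27,4,1,0,25,62,0), (28,3,0,1,75,55,1), (31,4,0,1,0,120,0),
    (34,3,0,1,0,145,1), (24,3,0,1,25,108,0), (28,4,0,1,50,196,0),
    (51,4,0,1,75,190,1), (28,4,0,1,25,371,1), (38,3,0,1,25,222,1),
    (24,3,0,0,50,87,0), (34,3,0,0,0,71,1), (28,4,1,0,25,115,0),
    (29,3,0,1,50,95,0), (25,3,0,1,25,163,0), (29,3,0,0,50,200,0),
    (26,3,0,1,25,48,0), (24,3,0,1,25,209,1),
]

data = np.array(rows, dtype=float)
X = np.column_stack([np.ones(len(data)), data[:, :6]])  # intercept + predictors
y = data[:, 6]

beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS coefficients

# Standard errors from the usual OLS variance estimate.
resid = y - X @ beta
dof = len(y) - X.shape[1]
sigma2 = resid @ resid / dof
se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))

for name, b, s in zip(
        ["Intercept", "Age", "Year", "Elective", "Gender", "Confidence", "Time"],
        beta, se):
    print(f"{name:<10} {b:+.3f} (SE {s:.3f})")
```

This reproduces the structure of the analysis, not its exact numbers; the original authors may have used different codings or statistical software.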

Additionally, a correlation matrix was assembled to look for associations between the variables. The strongest correlation was between year of training and the ophthalmology elective, which is not surprising, since students further along in their training would be more likely to have taken an elective course in ophthalmology. Table 3 shows the correlation matrix.
Table 3

Correlation matrix of age, year of medical school, gender, confidence, time, and correct selection

 | 1 | 2 | 3 | 4 | 5 | 6 | 7
1. Age | 1 | | | | | |
2. Year | 0.332 | 1 | | | | |
3. Elective | −0.041 | 0.537 | 1 | | | |
4. Gender | −0.144 | 0.021 | 0.382 | 1 | | |
5. Confidence | 0.099 | −0.005 | −0.104 | −0.192 | 1 | |
6. Time | 0.174 | 0.250 | −0.191 | −0.195 | −0.129 | 1 |
7. Correct | 0.409 | 0.021 | −0.174 | −0.163 | −0.192 | 0.343 | 1
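A Pearson correlation matrix like Table 3 can be recomputed directly from the rows in Table 1. The sketch below assumes 0/1 codings for the categorical variables (not stated in the paper); since Table 1 lists 32 of the 33 participants, the values only approximate Table 3.

```python
import numpy as np

# Sketch: rebuild a Pearson correlation matrix like Table 3 from Table 1.
# Coding assumptions (not stated in the paper): elective No=0/Yes=1,
# gender Female=0/Male=1, correct No=0/Yes=1.

rows = [  # (age, year, elective, gender, confidence, time, correct)
    (24,3,0,0,0,206,1), (28,3,0,0,25,120,0), (24,3,0,1,50,219,0),
    (25,3,0,1,75,80,0), (25,3,0,1,50,129,1), (24,3,0,1,50,65,0),
    (25,3,0,1,25,224,0), (31,3,0,1,0,150,1), (27,3,0,1,50,60,0),
    (27,3,0,0,25,110,0), (37,3,0,1,50,173,0), (25,3,0,0,25,115,0),
    (26,3,0,1,25,67,0), (26,3,0,0,50,110,0), (26,3,0,1,25,183,0),
    (27,4,1,0,25,62,0), (28,3,0,1,75,55,1), (31,4,0,1,0,120,0),
    (34,3,0,1,0,145,1), (24,3,0,1,25,108,0), (28,4,0,1,50,196,0),
    (51,4,0,1,75,190,1), (28,4,0,1,25,371,1), (38,3,0,1,25,222,1),
    (24,3,0,0,50,87,0), (34,3,0,0,0,71,1), (28,4,1,0,25,115,0),
    (29,3,0,1,50,95,0), (25,3,0,1,25,163,0), (29,3,0,0,50,200,0),
    (26,3,0,1,25,48,0), (24,3,0,1,25,209,1),
]

names = ["Age", "Year", "Elective", "Gender", "Confidence", "Time", "Correct"]
R = np.corrcoef(np.array(rows, dtype=float), rowvar=False)  # 7x7 matrix

for i, name in enumerate(names):
    # Print the lower triangle, as in Table 3.
    print(f"{name:<11}", np.round(R[i, : i + 1], 3))
```

On these 32 rows, the age-vs-correct and year-vs-elective entries come out close to the 0.409 and 0.537 reported in Table 3, which supports the assumed codings.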

Discussion

This paper describes a software-based method to objectively evaluate a student's ability to visualize the fundus of a live human patient in real time without the assistance of mydriatic eye drops. The decision not to dilate the eyes was made with the consideration that a majority of medical students are likely to end up in a primary care setting, where the eyes typically are not dilated for examination. The results suggest that there is a deficit in the proficiency of the students we tested. Even when the correct photo was selected, the students expressed a very low level of confidence in their selection. Perhaps even more concerning, the average confidence level of students who made incorrect choices was higher than that of students who selected correctly (34.8% vs. 27.5%). Only the age of the students correlated significantly with success in choosing the correct photo (p=0.02). This finding is likely coincidental, as a student's age has no bearing on their training in a standardized medical program.

Limitations

There are several limitations to the methods used in this study. First, the generalizability of the results is questionable due to the limited sample size. A future multicenter trial would be useful. The potential cost of maintaining this training program could be high, as SPs may need to be compensated for being photographed and examined. Although all the SPs in this study were unpaid volunteers, it may be difficult to recruit volunteers on a larger scale. Some of the cost could be mitigated by having the students themselves act as the patients, which would also alleviate scheduling issues with SP availability. Additionally, if students' photos were used, they could employ the software as a tool to train using each other as SPs. Next, this method demonstrates only a student's ability to visualize the fundus, not to evaluate or identify pathology. The authors are confident that once students master the skill of visualizing the fundus, it would be appropriate to train identification of pathology through photographs. Lastly, the use of an on-site proctor collecting data for off-site interpretation added to the complexity of the examinations. If the software were made to randomize and interpret the data and report the results, students could perhaps use it for practice without any help or observation from a proctor.
References (12 in total)

1.  Funduscopy: a forgotten art?

Authors:  E Roberts; R Morgan; D King; L Clerkin
Journal:  Postgrad Med J       Date:  1999-05       Impact factor: 2.401

2.  Internet-based assessment of medical students' ophthalmoscopy skills.

Authors:  Peter Asman; Christina Lindén
Journal:  Acta Ophthalmol       Date:  2010-12       Impact factor: 3.761

3.  Medical students' self-confidence in performing direct ophthalmoscopy in clinical training.

Authors:  R R Gupta; Wai-Ching Lam
Journal:  Can J Ophthalmol       Date:  2006-04       Impact factor: 1.882

4.  Evaluation of a tool to teach medical students direct ophthalmoscopy.

Authors:  Tracy B Hoeg; Bhavna P Sheth; Dawn S Bragg; Jane D Kivlin
Journal:  WMJ       Date:  2009-02

5.  The demise of direct ophthalmoscopy: A modern clinical challenge.

Authors:  Devin D Mackay; Philip S Garza; Beau B Bruce; Nancy J Newman; Valérie Biousse
Journal:  Neurol Clin Pract       Date:  2015-04

6.  Factors associated with confidence in fundoscopy.

Authors:  Christopher Schulz; Peter Hodgkins
Journal:  Clin Teach       Date:  2014-10

7.  Matching fundus photographs of classmates. An informal competition to promote learning and practice of direct ophthalmoscopy among medical students.

Authors:  Jørgen Krohn; Bård Kjersem; Gunnar Høvding
Journal:  J Vis Commun Med       Date:  2014-04-02

8.  Objectives of teaching direct ophthalmoscopy to medical students. (Review)

Authors:  Jochanan Benbassat; Bettine C P Polak; Jonathan C Javitt
Journal:  Acta Ophthalmol       Date:  2011-11-01       Impact factor: 3.761

9.  Teaching ophthalmoscopy to medical students (the TOTeMS study).

Authors:  Linda P Kelly; Philip S Garza; Beau B Bruce; Emily B Graubart; Nancy J Newman; Valérie Biousse
Journal:  Am J Ophthalmol       Date:  2013-09-13       Impact factor: 5.258

10.  The use of peer optic nerve photographs for teaching direct ophthalmoscopy.

Authors:  Behrad Y Milani; Mercede Majdi; Wesley Green; Amir Mehralian; Majid Moarefi; Freddie S Oh; Janet M Riddle; Ali R Djalilian
Journal:  Ophthalmology       Date:  2012-12-12       Impact factor: 12.079

