
The usability of personal digital assistants (PDAs) for assessment of practical performance.

Ina Treadwell

Abstract

The administration of an objective structured clinical examination (OSCE) using paper checklists presents problems such as illegible handwriting, missing student names and/or numbers and lost checklists. Calculating and entering results is not only time-consuming but also subject to human error, and feedback to students is rarely available. To rectify these problems, personal digital assistants (PDAs) and HaPerT software were acquired to replace paper checklists and provide automated results and feedback. This study sought to determine the usability of the PDA assessment system. The usability of the PDA system was evaluated according to effectiveness, efficiency and user satisfaction. Effectiveness was judged by comparing the results of an OSCE conducted in 2003 (paper-based method) and repeated in 2004 and 2005 (PDA method). Efficiency was determined by calculating the amount of time required to organise the logistics for 2 consecutive PDA OSCEs and deliver results and grades, compared with the time required for the paper-based OSCE. User satisfaction was established by using questionnaires to obtain feedback on the assessors' experiences during their first assessments. An independent-groups t-test comparing the mean scores achieved by students in the PDA and paper-based OSCEs showed that the difference in effectiveness was not significant. In terms of efficiency, 77% less time was used for the PDA OSCE in 2004 and 93% less in 2005. Assessor feedback on PDA assessment was overwhelmingly positive. Assessment by PDA was found to be just as effective as, and more efficient than, paper-based assessment in practical examinations, and was highly rated by assessors.
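The comparison described above rests on an independent-groups (two-sample) t-test of mean OSCE scores. As a minimal sketch of that statistic, the following computes a pooled-variance Student's t on two hypothetical score samples; the data and the `independent_t` helper are illustrative assumptions, not the study's actual scores or analysis code.

```python
import statistics

def independent_t(a, b):
    """Pooled-variance Student's t statistic for two independent samples."""
    na, nb = len(a), len(b)
    mean_a, mean_b = statistics.fmean(a), statistics.fmean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)  # sample variances
    pooled = ((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2)
    se = (pooled * (1 / na + 1 / nb)) ** 0.5  # standard error of the mean difference
    return (mean_a - mean_b) / se

# Hypothetical OSCE percentage scores (NOT the study's data)
paper = [62, 71, 68, 74, 65, 70]
pda = [64, 69, 70, 73, 66, 72]

t = independent_t(pda, paper)  # small |t| -> no significant difference
```

A |t| value near zero (well below the critical value for the relevant degrees of freedom) is what "the difference in effectiveness was not significant" corresponds to; in practice a library routine such as SciPy's `ttest_ind` would also return the p-value directly.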


Year:  2006        PMID: 16925635     DOI: 10.1111/j.1365-2929.2006.02543.x

Source DB:  PubMed          Journal:  Med Educ        ISSN: 0308-0110            Impact factor:   6.251


Related articles: 6 in total

1.  Web-based education for low-literate parents in Neonatal Intensive Care Unit: development of a website and heuristic evaluation and usability testing.

Authors:  Jeungok Choi; Suzanne Bakken
Journal:  Int J Med Inform       Date:  2010-08       Impact factor: 4.046

2.  Evaluation of satisfaction and use of electronic intervention for behavior change.

Authors:  Denise Maierle; Polly Ryan
Journal:  Comput Inform Nurs       Date:  2011-11       Impact factor: 1.985

3.  Usability and preference of electronic vs. paper and pencil OSCE checklists by examiners and influence of checklist type on missed ratings in the Swiss Federal Licensing Exam.

Authors:  Felicitas L Wagner; Sabine Feller; Felix M Schmitz; Philippe G Zimmermann; Rabea Krings; Sissel Guttormsen; Sören Huwendiek
Journal:  GMS J Med Educ       Date:  2022-04-14

4.  Tablet versus paper marking in assessment: feedback matters.

Authors:  Alan Denison; Emily Bate; Jessica Thompson
Journal:  Perspect Med Educ       Date:  2016-04

5.  Innovative Method to Digitize a Web-Based OSCE Evaluation System for Medical Students: A Cross-Sectional Study in University Hospital in Saudi Arabia.

Authors:  Abdullah A Yousef; Bassam H Awary; Faisal O AlQurashi; Waleed H Albuali; Mohammad H Al-Qahtani; Syed I Husain; Omair Sharif
Journal:  Int J Gen Med       Date:  2022-02-03

6.  The efficiency and effectiveness of surgery information systems in Iran.

Authors:  Faezeh Abbasi; Reza Khajouei; Moghaddameh Mirzaee
Journal:  BMC Med Inform Decis Mak       Date:  2020-09-16       Impact factor: 2.796

