Literature DB >> 24833946

An integrable, web-based solution for easy assessment of video-recorded performances.

Yousif Subhi1, Tobias Todsen2, Lars Konge3.   

Abstract

Assessment of clinical competencies by direct observation is problematic for two main reasons: the identity of the examinee influences the assessment scores, and direct observation demands experts at the exact location at the exact time. Recording the performance can overcome these problems; however, managing video recordings and assessment sheets is troublesome and may lead to missing or incorrect data. Currently, no existing software provides a local solution for the management of videos and assessments, but this is necessary because assessment scores are confidential information, and access to this information should be restricted to select personnel. A local software solution may also ease customization to local needs and integration into existing user databases or project management software. We developed an integrable web-based solution for easy assessment of video-recorded performances (ISEA).

Keywords:  assessment software; education assessment; video-based assessment

Year:  2014        PMID: 24833946      PMCID: PMC4014379          DOI: 10.2147/AMEP.S62277

Source DB:  PubMed          Journal:  Adv Med Educ Pract        ISSN: 1179-7258


Background

Clinical performance can be assessed by expert observation using assessment sheets, but this is problematic for two main reasons. Firstly, the identity of the examinee can influence the assessment.1,2 This can be due to personal relations and prejudices,1 which can be a large problem in smaller countries or within smaller medical specialties. Age and job title have also been shown to influence assessment scores.2 Secondly, assessments are time-consuming and require experts to be at a specific location at a specific time. A feasible way of addressing these problems is by video recording the clinical performance. Recording videos can be unproblematic if designed carefully.3 However, the assessment procedure can easily become problematic and lead to logistic problems. Typically, hard drives or flash drives with videos and assessment sheets are sent to all examiners. When these are returned to the data manager, some sheets may be incomplete and, in some cases, may even have been forgotten, requiring the data manager to go through a tiresome process to obtain all assessment scores. Finally, when all the cases are fully assessed, paper sheets are transcribed into a database in duplicate to avoid typing errors. This nonautomated assessment procedure is costly, time-consuming, and error-prone. Existing software enables an automated process4,5 but is not optimized for video-recorded assessments and is problematic because data are stored at external institutions; instead, data should be stored locally, as assessment scores are confidential information, and access to them should be restricted to course personnel (or study personnel, in the case of research).6,7 We developed an integrable web-based solution for easy assessment of video-recorded performances (ISEA).

Design

The design of ISEA is based on two user interfaces: one for the administrator and one for the examiner. The administrator manages projects, examiners, videos, and assessment sheets. A “project” is defined as a group of examiners who assess videos based on the same assessment sheets. The assessment sheet is composed of questions, each having a number of scoring items. Within a project, examiners are assigned one or more videos that, if necessary, may differ between the examiners. Results from all assignments are available from the administrator interface. User profiles and results are stored in a local database. Depending on the legal requirements in each project, videos can either be uploaded to an external video storage platform, such as Vimeo,8 or stored internally within an institution, and ISEA is then provided with a link to the uploaded video. Examiners have their own personal page with a list of assigned videos. An assigned video is seen side-by-side with the assessment sheet, to enable simultaneous assessment (Figure 1). When the assignment is completed, it is locked and scores cannot be changed. The results are available for the administrator and can be exported in a semicolon-separated or a comma-separated file format (.csv).
Figure 1

Simultaneous view of the video and the assessment sheet, as seen by the examiner.
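The workflow described above (assigning videos to examiners, locking a completed assignment so scores cannot be changed, and exporting results in a semicolon- or comma-separated format) can be sketched in outline. The following is a minimal Python illustration with assumed names, not ISEA's actual PHP/MySQL implementation:

```python
import csv
import io

class Assignment:
    """One examiner's assessment of one video (illustrative names only)."""
    def __init__(self, examiner, video):
        self.examiner = examiner
        self.video = video
        self.scores = {}     # question -> scoring item chosen
        self.locked = False  # set once the assignment is completed

    def record(self, question, score):
        # Completed assignments are locked; their scores cannot be changed.
        if self.locked:
            raise PermissionError("completed assignments cannot be changed")
        self.scores[question] = score

    def complete(self):
        self.locked = True

def export_results(assignments, delimiter=";"):
    """Export all results as delimiter-separated text (.csv)."""
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter=delimiter)
    writer.writerow(["examiner", "video", "question", "score"])
    for a in assignments:
        for question, score in a.scores.items():
            writer.writerow([a.examiner, a.video, question, score])
    return buf.getvalue()

a = Assignment("examiner1", "video01")
a.record("Q1", 4)
a.complete()
print(export_results([a]))
```

The delimiter parameter mirrors the choice between semicolon- and comma-separated export offered by the administrator interface.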

We used PHP: Hypertext Preprocessor (The PHP Group) and My Structured Query Language (MySQL) as the server-side scripting language and database, respectively, as these are the most popular and commonly used9,10 and may facilitate easier implementation and integration with existing software. We wanted the examiners to be able to use our web solution on any digital device anywhere, so the interfaces were developed using very simple Hyper Text Markup Language (HTML) and Cascading Style Sheets (CSS) that are compatible with a large range of internet browsers, including those on smartphones and tablets. No add-on software is needed for ISEA. This clean and simple interface may ease integration with existing platforms. Adobe Dreamweaver CS5 (Adobe Systems, Inc., San Jose, CA, USA) was used for development. A download link for ISEA is available in the references.11
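To illustrate how little markup such a side-by-side view requires, a page of roughly the following shape suffices. This is an illustrative sketch only; the element names, class names, file names, and form fields are assumptions, not ISEA's actual markup:

```html
<!-- Two floated columns: the video on the left, the assessment sheet on the right -->
<div class="assignment">
  <div style="float: left; width: 60%;">
    <video src="performance.mp4" controls width="100%"></video>
  </div>
  <form style="float: left; width: 40%;" method="post" action="submit.php">
    <p>Question 1</p>
    <label><input type="radio" name="q1" value="1"> 1</label>
    <label><input type="radio" name="q1" value="5"> 5</label>
    <input type="submit" value="Complete assignment">
  </form>
</div>
```

Because the page is plain HTML and CSS with no add-on software, it renders in essentially any browser, which is what makes assessment from a smartphone or tablet possible.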

Discussion

Video-based assessment of clinical performance may help eliminate bias related to the identity of the examinee by masking the identity, which can be achieved by adjusting the camera angle or by editing the video afterwards.1,12,13 However, the identity is rarely perfectly hidden – the face may be blurred and the voice may be distorted, but the identity may still be guessed if the examiner is familiar with the examinee and is aware of his or her specific characteristics, such as manner of speech, preferences in clothing, or characteristic body figure or posture. Also, in some cases, facial expression, eye contact, or the finer aspects of the voice, such as the tone, may be relevant for the assessment itself. A step toward better blinding and the reduction of biases emerging from personal relations may therefore be achieved not only by the use of videos but also by collaboration with examiners at other institutions and in other countries. Video-recorded assessments provide flexibility for the examiners, as they no longer need to be at a specific location at a specific time but may be at any location and assess the clinical performance at any time.12 Implementing video-based assessments is, theoretically, not a problem, but practically, this is a tiresome process with potential pitfalls.
Few studies have compared direct observation with video-based assessments.13,14 Ma et al assessed central venous catheterization using both direct observation and video recordings and found no differences in ratings, and they reported that importance should be given to practical aspects.13 Kneebone et al explored the feasibility of a video-based assessment procedure with remotely placed examiners, and they also reported that a future improvement would be an automated approach for easy assessment of video-recorded performances.14 We developed ISEA, an integrable web-based solution for easy assessment of video-recorded performances that stores data locally, can be customized for integration with existing systems, does not require any add-on software, and is compatible with a large range of web browsers. ISEA may ease the assessment process of video-recorded performances and collaboration with examiners from other institutions. ISEA was used in the assessment process of a recent project,15 and the general comment from the examiners was that ISEA was easy to use. Also, ISEA can be customized to local systems for integration with existing project or user management systems used to administer courses and research. A few limitations should be noted. ISEA is an integrable application and not an "out of the box", stand-alone application, so implementation and customization may require some technical assistance. Customization, however, can be advantageous and even necessary for some types of education and research projects.7 We did not describe the implementation process in this paper, as it is more technical and may vary greatly between institutions, depending on the software used. In its simplest form and without much effort, ISEA can be implemented using only a user database for the administrator and the examiners.
In our experience, implementation has been unproblematic, and the limited technical assistance needed to implement ISEA is a small price to pay for a smooth and easy assessment process.15
References (10 in total)

1.  Is a resident's score on a videotaped objective structured assessment of technical skills affected by revealing the resident's identity?

Authors:  Val Y Vogt; Vanessa M Givens; Craig A Keathley; Gary H Lipscomb; Robert L Summitt
Journal:  Am J Obstet Gynecol       Date:  2003-09       Impact factor: 8.661

2.  A web-based repository of surgical simulator projects.

Authors:  Peter Leskovský; Matthias Harders; Gábor Székely
Journal:  Stud Health Technol Inform       Date:  2006

3.  Assessing procedural skills in context: Exploring the feasibility of an Integrated Procedural Performance Instrument (IPPI).

Authors:  R Kneebone; D Nestel; F Yadollahi; R Brown; C Nolan; J Durack; H Brenton; C Moulton; J Archer; A Darzi
Journal:  Med Educ       Date:  2006-11       Impact factor: 6.251

4.  A multifunctional online research portal for facilitation of simulation-based research: a report from the EXPRESS pediatric simulation research collaborative.

Authors:  Adam Cheng; Vinay Nadkarni; Elizabeth A Hunt; Karim Qayumi
Journal:  Simul Healthc       Date:  2011-08       Impact factor: 1.929

5.  Reliable and valid assessment of point-of-care ultrasonography.

Authors:  Tobias Todsen; Martin Grønnebæk Tolsgaard; Beth Härstedt Olsen; Birthe Merete Henriksen; Jens Georg Hillingsø; Lars Konge; Morten Lind Jensen; Charlotte Ringsted
Journal:  Ann Surg       Date:  2015-02       Impact factor: 12.969

6.  Notes From the Field: Direct Observation Versus Rating by Videos for the Assessment of Central Venous Catheterization Skills.

Authors:  Irene W Y Ma; Nadia Zalunardo; Mary E Brindle; Rose Hatala; Kevin McLaughlin
Journal:  Eval Health Prof       Date:  2014-01-12       Impact factor: 2.651

7.  Computer-assisted video evaluation of surgical skills.

Authors:  C R Beckmann; G H Lipscomb; F W Ling; C A Beckmann; H Johnson; L Barton
Journal:  Obstet Gynecol       Date:  1995-06       Impact factor: 7.661

8.  Reliable and valid assessment of competence in endoscopic ultrasonography and fine-needle aspiration for mediastinal staging of non-small cell lung cancer.

Authors:  L Konge; P Vilmann; P Clementsen; J T Annema; C Ringsted
Journal:  Endoscopy       Date:  2012-07-23       Impact factor: 10.093

9.  A web-based clinical trial management system for a sham-controlled multicenter clinical trial in depression.

Authors:  Valerie Durkalski; Catherine Dillon; Jaemyung Kim
Journal:  Clin Trials       Date:  2010-01-18       Impact factor: 2.486

10.  Internet based multi-institutional clinical research: a convenient and secure option.

Authors:  Costas D Lallas; Glenn M Preminger; Margaret S Pearle; Raymond J Leveillee; James E Lingeman; John P Schwope; Paul K Pietrow; Brian K Auge
Journal:  J Urol       Date:  2004-05       Impact factor: 7.450

