
A Randomized Trial Comparing Classical Participatory Design to VandAID, an Interactive CrowdSourcing Platform to Facilitate User-centered Design.

Kevin R Dufendach, Sabine Koch, Kim M Unertl, Christoph U Lehmann.

Abstract

BACKGROUND: Early involvement of stakeholders in the design of medical software is particularly important due to the need to incorporate the complex knowledge and actions associated with clinical work. Standard user-centered design methods include focus groups and participatory design sessions with individual stakeholders, which generally limit user involvement to a small number of individuals because of the significant time investment required of designers and end users.
OBJECTIVES: The goal of this project was to reduce the effort for end users to participate in co-design of a software user interface by developing an interactive web-based crowdsourcing platform.
METHODS: In a randomized trial, we compared a new web-based crowdsourcing platform to standard participatory design sessions. We developed an interactive, modular platform that allows responsive remote customization and design feedback on a visual user interface based on user preferences. The responsive canvas is a dynamic HTML template that responds in real time to user preference selections. Upon completion, the design team can view the user's interface creations through an administrator portal and download the structured selections through a REDCap interface.
RESULTS: We have created a software platform that allows users to customize a user interface and see the results of that customization in real time, receiving immediate feedback on the impact of their design choices. Neonatal clinicians used the new platform to successfully design and customize a neonatal handoff tool. Despite receiving no specific instruction, they used the software easily and reported high usability.
CONCLUSIONS: VandAID, a new web-based crowdsourcing platform, can involve multiple users in user-centered design simultaneously and provides a means of obtaining design feedback remotely. The software can provide design feedback at any stage in the design process, but it will be of greatest utility for specifying user requirements and evaluating iterative designs with multiple options.
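The "responsive canvas" the abstract describes — a dynamic HTML template that re-renders as the user changes preference selections — can be illustrated with a minimal sketch. This is a hypothetical illustration of the general technique, not the actual VandAID code; the interface, function, and field names (`Preferences`, `renderHandoffCard`, `showVitals`, etc.) are all assumptions introduced for the example.

```typescript
// Hypothetical sketch of a responsive-canvas pattern: a pure render
// function maps the user's current preference selections to an HTML
// fragment, so every selection change can trigger an immediate
// re-render and the user sees the effect of each choice in real time.

interface Preferences {
  showVitals: boolean;            // include a vitals section on the card?
  sortOrder: "bed" | "acuity";    // how patient cards are ordered
  fontSize: number;               // base font size in px
}

function renderHandoffCard(patientName: string, prefs: Preferences): string {
  // Conditionally include sections based on the user's selections.
  const vitals = prefs.showVitals
    ? `<section class="vitals">HR / RR / SpO2</section>`
    : "";
  return (
    `<article style="font-size:${prefs.fontSize}px" data-sort="${prefs.sortOrder}">` +
    `<h2>${patientName}</h2>${vitals}</article>`
  );
}

// Example: toggling a preference changes the rendered fragment at once.
const withVitals = renderHandoffCard("Baby A", {
  showVitals: true,
  sortOrder: "acuity",
  fontSize: 14,
});
```

In a browser, the render function would be wired to `change` events on the preference controls, and the resulting selections could be serialized as structured data (the abstract mentions download via a REDCap interface) for the design team to review.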

Keywords:  Computers; human factors and engineering; human-computer interface; informatics

Year:  2017        PMID: 28451689      PMCID: PMC5976485          DOI: 10.3414/ME16-01-0098

Source DB:  PubMed          Journal:  Methods Inf Med        ISSN: 0026-1270            Impact factor:   2.176


References (10 in total)

1.  Mixed results in the safety performance of computerized physician order entry.

Authors:  Jane Metzger; Emily Welebob; David W Bates; Stuart Lipsitz; David C Classen
Journal:  Health Aff (Millwood)       Date:  2010-04       Impact factor: 6.301

2.  Electronic Health Record Vendor Adherence to Usability Certification Requirements and Testing Standards.

Authors:  Raj M Ratwani; Natalie C Benda; A Zachary Hettinger; Rollin J Fairbanks
Journal:  JAMA       Date:  2015-09-08       Impact factor: 56.272

3.  "e-Iatrogenesis": the most critical unintended consequence of CPOE and other HIT.

Authors:  Jonathan P Weiner; Toni Kfuri; Kitty Chan; Jinnet B Fowles
Journal:  J Am Med Inform Assoc       Date:  2007-02-28       Impact factor: 4.497

4.  Research electronic data capture (REDCap)--a metadata-driven methodology and workflow process for providing translational research informatics support.

Authors:  Paul A Harris; Robert Taylor; Robert Thielke; Jonathon Payne; Nathaniel Gonzalez; Jose G Conde
Journal:  J Biomed Inform       Date:  2008-09-30       Impact factor: 6.317

5.  From user needs to system specifications: multi-disciplinary thematic seminars as a collaborative design method for development of health information systems.

Authors:  I Scandurra; M Hägglund; S Koch
Journal:  J Biomed Inform       Date:  2008-02-07       Impact factor: 6.317

6.  Electronic health record usability: analysis of the user-centered design processes of eleven electronic health record vendors.

Authors:  Raj M Ratwani; Rollin J Fairbanks; A Zachary Hettinger; Natalie C Benda
Journal:  J Am Med Inform Assoc       Date:  2015-06-06       Impact factor: 4.497

7.  Applying human factors principles to alert design increases efficiency and reduces prescribing errors in a scenario-based simulation.

Authors:  Alissa L Russ; Alan J Zillich; Brittany L Melton; Scott A Russell; Siying Chen; Jeffrey R Spina; Michael Weiner; Elizabette G Johnson; Joanne K Daggy; M Sue McManus; Jason M Hawsey; Anthony G Puleo; Bradley N Doebbeling; Jason J Saleem
Journal:  J Am Med Inform Assoc       Date:  2014-03-25       Impact factor: 4.497

8.  Participatory Design, User Involvement and Health IT Evaluation.

Authors:  Andre Kushniruk; Christian Nøhr
Journal:  Stud Health Technol Inform       Date:  2016

9.  A computerized provider order entry intervention for medication safety during acute kidney injury: a quality improvement report.

Authors:  Allison B McCoy; Lemuel R Waitman; Cynthia S Gadd; Ioana Danciu; James P Smith; Julia B Lewis; Jonathan S Schildcrout; Josh F Peterson
Journal:  Am J Kidney Dis       Date:  2010-08-14       Impact factor: 8.860

10.  SMART on FHIR: a standards-based, interoperable apps platform for electronic health records.

Authors:  Joshua C Mandel; David A Kreda; Kenneth D Mandl; Isaac S Kohane; Rachel B Ramoni
Journal:  J Am Med Inform Assoc       Date:  2016-02-17       Impact factor: 4.497

