
STARE-HI--Statement on reporting of evaluation studies in Health Informatics.

Jan Talmon, Elske Ammenwerth, Jytte Brender, Nicolette de Keizer, Pirkko Nykänen, Michael Rigby.

Abstract

OBJECTIVE: Development of guidelines for publication of evaluation studies of Health Informatics applications.
METHODS: An initial list of issues to be addressed in reports on evaluation studies was drafted based on experiences as editors and reviewers of journals in Health Informatics and as authors of systematic reviews of Health Informatics studies, taking into account guidelines for the reporting of medical research. This list was discussed in several rounds by a growing number of experts in Health Informatics evaluation during conferences and via e-mail, and was posted on the web for comment.
RESULTS: A set of STARE-HI principles to be addressed in papers describing evaluations of Health Informatics interventions is presented. These principles include formulation of title and abstract, of introduction (e.g. scientific background, study objectives), study context (e.g. organizational setting, system details), methods (e.g. study design, outcome measures), results (e.g. study findings, unexpected observations) and discussion and conclusion of an IT evaluation paper.
CONCLUSION: A comprehensive list of principles relevant for properly describing Health Informatics evaluations has been developed. When manuscripts submitted to Health Informatics journals and general medical journals adhere to these principles, readers will be better positioned to place the studies in a proper context and judge their validity and generalisability. It will also be easier to judge whether papers fit the scope of meta-analyses of Health Informatics interventions. STARE-HI may also be used for study planning and hence positively influence the quality of evaluation studies in Health Informatics. We believe that better publication of both quantitative and qualitative evaluation studies is an important step toward the vision of evidence-based Health Informatics.
LIMITATIONS: This study is based on the experiences of editors, reviewers, authors of systematic reviews and readers of the scientific literature. The applicability of the principles has not been evaluated in real practice. Only when authors start to use these principles for reporting will shortcomings in the principles emerge.

Mesh:

Year:  2008        PMID: 18930696     DOI: 10.1016/j.ijmedinf.2008.09.002

Source DB:  PubMed          Journal:  Int J Med Inform        ISSN: 1386-5056            Impact factor:   4.046


Related articles: 53 in total

1.  From expert-derived user needs to user-perceived ease of use and usefulness: a two-phase mixed-methods evaluation framework.

Authors:  Mary Regina Boland; Alexander Rusanov; Yat So; Carlos Lopez-Jimenez; Linda Busacca; Richard C Steinman; Suzanne Bakken; J Thomas Bigger; Chunhua Weng
Journal:  J Biomed Inform       Date:  2013-12-12       Impact factor: 6.317

2.  STARE-HI - Statement on Reporting of Evaluation Studies in Health Informatics: explanation and elaboration.

Authors:  J Brender; J Talmon; N de Keizer; P Nykänen; M Rigby; E Ammenwerth
Journal:  Appl Clin Inform       Date:  2013-07-24       Impact factor: 2.342

3.  Evaluating a Modular Decision Support Application For Colorectal Cancer Screening.

Authors:  Laura G Militello; Julie B Diiulio; Morgan R Borders; Christen E Sushereba; Jason J Saleem; Donald Haverkamp; Thomas F Imperiale
Journal:  Appl Clin Inform       Date:  2017-02-15       Impact factor: 2.342

4.  Using the time and motion method to study clinical work processes and workflow: methodological inconsistencies and a call for standardized research.

Authors:  Kai Zheng; Michael H Guo; David A Hanauer
Journal:  J Am Med Inform Assoc       Date:  2011-04-27       Impact factor: 4.497

5.  Evaluation Considerations for Secondary Uses of Clinical Data: Principles for an Evidence-based Approach to Policy and Implementation of Secondary Analysis.

Authors:  P J Scott; M Rigby; E Ammenwerth; J Brender McNair; A Georgiou; H Hyppönen; N de Keizer; F Magrabi; P Nykänen; W T Gude; W Hackl
Journal:  Yearb Med Inform       Date:  2017-09-11

6.  Measuring the impact of health information technology.

Authors:  D Hanauer; K Zheng
Journal:  Appl Clin Inform       Date:  2012-09-12       Impact factor: 2.342

7.  Measurement error in performance studies of health information technology: lessons from the management literature.

Authors:  A S Litwin; A C Avgar; P J Pronovost
Journal:  Appl Clin Inform       Date:  2012-06-13       Impact factor: 2.342

8.  In search of dialogue and discourse in applied clinical informatics.

Authors:  G R Kim; C U Lehmann
Journal:  Appl Clin Inform       Date:  2009-10-14       Impact factor: 2.342

9.  Formative evaluation of a telemedicine model for delivering clinical neurophysiology services part I: utility, technical performance and service provider perspective.

Authors:  Patricia Breen; Kevin Murphy; Geraldine Browne; Fiona Molloy; Valerie Reid; Colin Doherty; Norman Delanty; Sean Connolly; Mary Fitzsimons
Journal:  BMC Med Inform Decis Mak       Date:  2010-09-15       Impact factor: 2.796

10.  Evaluating eHealth: how to make evaluation more methodologically robust.

Authors:  Richard James Lilford; Jo Foster; Mike Pringle
Journal:  PLoS Med       Date:  2009-11-24       Impact factor: 11.069

