OBJECTIVE: To develop guidelines for the publication of evaluation studies of Health Informatics applications. METHODS: An initial list of issues to be addressed in reports of evaluation studies was drafted based on experience as editors and reviewers of Health Informatics journals and as authors of systematic reviews of Health Informatics studies, taking into account existing guidelines for reporting medical research. The list was discussed in several rounds by a growing number of experts in Health Informatics evaluation at conferences and via e-mail, and was posted on the web for public comment. RESULTS: We present a set of STARE-HI principles to be addressed in papers describing evaluations of Health Informatics interventions. The principles cover the formulation of the title and abstract, the introduction (e.g. scientific background, study objectives), study context (e.g. organizational setting, system details), methods (e.g. study design, outcome measures), results (e.g. study findings, unexpected observations), and the discussion and conclusion of an IT evaluation paper. CONCLUSION: A comprehensive list of principles for properly describing Health Informatics evaluations has been developed. When manuscripts submitted to Health Informatics journals and general medical journals adhere to these principles, readers will be better positioned to place studies in their proper context and to judge their validity and generalisability. It will also be easier to judge whether papers fall within the scope of meta-analyses of Health Informatics interventions. STARE-HI may also be used for study planning and may thereby positively influence the quality of evaluation studies in Health Informatics. We believe that better reporting of both quantitative and qualitative evaluation studies is an important step toward the vision of evidence-based Health Informatics.
LIMITATIONS: This study is based on the experience of editors, reviewers, authors of systematic reviews, and readers of the scientific literature. The applicability of the principles has not been evaluated in practice; shortcomings in the principles will only emerge once authors begin using them in their reporting.