Kathrin Cresswell, Robin Williams, Aziz Sheikh.
Abstract
BACKGROUND: There is currently a lack of comprehensive, intuitive, and usable formative evaluation frameworks for health information technology (HIT) implementations. We therefore sought to develop and apply such a framework. This study describes the Technology, People, Organizations, and Macroenvironmental factors (TPOM) framework we developed.
Keywords: evaluation; health information technology; sociotechnical
Year: 2020 PMID: 32519968 PMCID: PMC7315366 DOI: 10.2196/15068
Source DB: PubMed Journal: J Med Internet Res ISSN: 1438-8871 Impact factor: 5.428
Examples of existing health information technology evaluation frameworks.
| Framework | Key characteristics | Reference |
| --- | --- | --- |
| Nonadoption, abandonment, scale-up, spread, and sustainability (NASSS) framework | This framework includes the following domains: the condition or illness, the technology, the value proposition, the adopter system, the organization(s), the wider context, and the interaction and mutual adaptation between all these domains over time. | Greenhalgh et al |
| Framework for Evaluation of Informatics Tools | This framework includes the following stages: specification and needs requirements, component development, integration of the system into a clinical setting, and routine use of the system. | Kaufman et al |
| Health Information Technology Evaluation Toolkit | This framework includes the following dimensions: articulating goals of the project, understanding stakeholders, and benefits measurement. | Cusack and Poon |
| Health Information Systems: human, organization, and technology-fit factors (HOT-fit) | This framework focuses on the fit between technological, human, and organizational dimensions. | Yusof et al |
| Health Information Technology Reference-based Evaluation Framework (HITREF) | This framework includes 6 dimensions: structural quality, functional quality, effects on quality processes, effects on outcome quality of care, unintended consequences, and barriers and facilitators. | Sockolow et al |
Data set informing the development of the evaluation framework.
| Project | Data set | Timeline |
| --- | --- | --- |
| National evaluation of the implementation of electronic health records in secondary care in England | 12 longitudinal qualitative case studies: 431 interviews, 590 hours of observations, 234 sets of field notes, and 809 documents | February 2009 to January 2011 |
| National evaluation of the implementation of clinical decision support/computerized physician order entry systems in English hospitals | 6 longitudinal qualitative case studies: 242 interviews, 32.5 hours of observations, and 55 documents | December 2011 to March 2016 |
| National evaluation of a pilot decision support platform in Scottish primary care | 30 interviews and 8 nonparticipant ethnographic observations | May 2018 to October 2018 |
The Technology, People, Organizations, and Macroenvironmental factors (TPOM) framework, with example descriptions of dimensions.
| Factor and dimension | Description |
| --- | --- |
| Technology factors | |
| Usability | What is the ease of use and learnability of the technology? |
| Performance | Does the technology function as intended by developers? |
| Adaptability and flexibility | Can system design be changed to suit emerging needs? |
| Dependability | Is the system reliable and stable? |
| Data availability, integrity, and confidentiality | Are data in the system available, accessible, and usable for those who need them? |
| Data accuracy | Are the data in the system accurate? |
| Sustainability | Is use of the technology sustainable? |
| Security | Is the system secure? |
| People factors | |
| User satisfaction | Who are the users? Are users satisfied with the technology? |
| Complete/correct use | Are features and functionality implemented and used as intended? |
| Attitudes and expectations | What benefits do users expect from using the technology, and how can these be measured? |
| Engagement | Are users actively engaged in implementation, adoption, and optimization? |
| Experiences | Do users have negative experiences with previous technologies? |
| Workload/benefits | Are the benefits and efforts relatively equal for all stakeholders? |
| Work processes | Does the system change relationships with patients, patterns of communication, and professional responsibilities (eg, an increase in administrative tasks)? |
| User input in design | Is there effective communication between designers, information technology staff, and end users, as well as between management and end users? |
| Organizational factors | |
| Leadership and management | Are management structures to support the implementation adequate? |
| Communication | Are aims, timelines, and strategy communicated? |
| Timelines | Are implementation timelines adequate? |
| Vision | What benefits do organizations expect from implementing the technology, and how can these be measured? Is a coherent and realistic vision driving developments? |
| Training and support | Is the training adequate and realistic? |
| Champions | Are champions and boundary spanners utilized? |
| Resources | Is implementation adequately resourced (including technology, change management, and maintenance)? |
| Monitoring and optimization | Are system performance and use monitored and optimized over time? Are lessons learned captured and incorporated into future efforts? |
| Macroenvironmental factors | |
| Media | How is the technology viewed by the media and by the public? How does the organization view and manage media relations? |
| Professional groups | How is the technology viewed by professional groups? |
| Political context | What benefits do policymakers expect from the technology, and how can these be measured? What is the national approach to achieving interoperability, and does the system align with it? Is there a coherent vision, a consistent approach, and a clear direction of travel, allowing a degree of local input? |
| Economic considerations and incentives | Are there clear incentives for organizations and users to implement (eg, improvements in quality of care)? Is sufficient funding in place to support the initiative? |
| Legal and regulatory aspects | Have legal and regulatory frameworks been established? |
| Vendors | Is vendor management effectively organized? |
| Measuring impact | Are the various stakeholders working together to define, validate, test, and refine outcome measures and measurement strategies? Are outcome measures important, clinically acceptable, transparent, feasible, and usable? |
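For evaluators who want to operationalize the TPOM dimensions as a structured data-collection aid, the taxonomy above can be sketched as a simple checklist structure. This is a minimal illustration only: the dictionary layout and the `blank_checklist` helper are our own constructions, not part of the published framework; the factor groups and dimension names are taken from the table above.

```python
# A minimal sketch of the TPOM framework as a checklist data structure.
# The dict layout and blank_checklist() helper are illustrative assumptions;
# the group and dimension names come from the TPOM table in this article.
TPOM = {
    "Technology": [
        "Usability", "Performance", "Adaptability and flexibility",
        "Dependability",
        "Data availability, integrity, and confidentiality",
        "Data accuracy", "Sustainability", "Security",
    ],
    "People": [
        "User satisfaction", "Complete/correct use",
        "Attitudes and expectations", "Engagement", "Experiences",
        "Workload/benefits", "Work processes", "User input in design",
    ],
    "Organizations": [
        "Leadership and management", "Communication", "Timelines",
        "Vision", "Training and support", "Champions", "Resources",
        "Monitoring and optimization",
    ],
    "Macroenvironmental": [
        "Media", "Professional groups", "Political context",
        "Economic considerations and incentives",
        "Legal and regulatory aspects", "Vendors", "Measuring impact",
    ],
}

def blank_checklist(framework):
    """Map each (factor group, dimension) pair to None, ready for
    formative-evaluation notes to be filled in per dimension."""
    return {(group, dim): None
            for group, dims in framework.items()
            for dim in dims}

checklist = blank_checklist(TPOM)
checklist[("Technology", "Usability")] = "Clinicians report steep learning curve"
```

A structure like this makes it straightforward to track which of the 31 dimensions an evaluation has covered and to export the notes for reporting.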
Figure 1. Diagram illustrating the evaluation framework.