Kathleen P. Conte, Penelope Hawe.
Abstract
Electronic or digital monitoring systems could promote the visibility of health promotion and disease prevention programs by providing new tools to support the collection, analysis, and reporting of data. In clinical settings, however, the benefits of e-monitoring of service delivery remain contested. While there are some examples of e-monitoring systems improving patient outcomes, the smooth introduction into clinical practice has not occurred. Expected efficiencies have not been realized. The restructuring of team work has been problematic. Most particularly, knowledge from research has not advanced sufficiently because the meaning of e-monitoring has not been well theorized in the first place. As enthusiasm for e-monitoring in health promotion grows, it behooves us to ensure that health promotion practice learns from these insights. We outline the history of program monitoring in health promotion and the development of large-scale e-monitoring systems to track policy and program delivery. We interrogate how these technologies can be understood, noticing how they inevitably elevate some parts of practice over others. We suggest that progress in e-monitoring research and development could benefit from the insights and methods of improvement science (the science that underpins how practitioners attempt to solve problems and promote quality) as conceptually distinct from implementation science (the science of getting particular evidence-based programs into practice). To fully appreciate whether e-monitoring of program implementation will act as an aid or barrier to health promotion practice we canvass a wide range of theoretical perspectives. We illustrate how different theories draw attention to different aspects of the role of e-monitoring, and its impact on practice.
Keywords: accountability; health information technology; health promotion; implementation; innovation; program monitoring; quality improvement
Year: 2018 PMID: 30258836 PMCID: PMC6145148 DOI: 10.3389/fpubh.2018.00243
Source DB: PubMed Journal: Front Public Health ISSN: 2296-2565
Glossary of terms.
| Electronic Monitoring (e-monitoring) | The use of electronic computer software or systems to conduct monitoring activities |
| Continuous Quality Improvement | “Continuous and ongoing effort to achieve measurable improvements in the efficiency, effectiveness, performance, accountability, outcomes, and other indicators of quality in services or processes which achieve and improve health of the community” |
| Digital Health Technologies | Electronic devices used to deliver, track, manage, and collect information used in the delivery of health services, or in endeavors to promote wellness. Used as an overarching term for multiple types of technologies that perform specific functions, e.g., electronic patient records, web-based program management and data collection systems |
| Health Informatics | The use of digital technologies to collect, analyze and communicate health information and data |
| Implementation Monitoring | The oversight of the delivery of interventions. Definitions vary, and may include some or all of the following: the delivery of components, the (number and type of) people reached, the intensity or “dose” of effort being applied, the circumstances surrounding delivery and the key milestones achieved |
| Implementation Science | “The scientific study of methods to promote the systematic uptake of research findings and other evidence-based practices into routine practice, and, hence, to improve the quality and effectiveness of health services” |
| Improvement Science | The systematic examination of the methods and factors that work best to facilitate quality improvement |
| Monitoring | “A continuing function that aims primarily to provide the management and main stakeholders of an ongoing intervention with early indications of progress, or lack thereof, in the achievement of results” |
| Quality Assurance/Quality Control | Systematic monitoring and evaluation of performance of an organization or its program to ensure that standards of quality are being met |
Two definitions are given to recognize the historic concern with (unwarranted) variation between different settings.
Note that this does not have to specifically include uptake of any particular evidence-based program. Improvement science has a focus on the systematic examination and interpretation of actions to improve quality and effectiveness, whereas some traditional definitions of quality control and quality improvement may be action-focused only (with less emphasis on using and adding to the science of the action).
Examples of software systems in use to support e-monitoring of health promotion implementation.
| Generic Application Software (examples include Microsoft Office, Apple Apps, Google Drive) | Free or fee-for-service software that provides basic but customizable computing functions including word processing, spreadsheets, databases, presentations, and web design. | Used to collect, store, and manage data. Templates developed in these applications have been used to facilitate large-scale data collection and reporting. For examples of use, see Fernald et al. |
| Survey Software | Customizable survey platforms that allow users to create bespoke surveys, collect and analyse data, and create reports. | Used to collect online survey data. For examples of use, see Brownson et al. |
| Project Management Software | Web-based project management tool that provides a central platform for project partners or staff to communicate, plan, track progress, and store files. | Used by Bors et al. |
| Healthy Kids, Healthy Communities Community Dashboard | Bors et al. | Web-based documentation and networking system designed to track progress and facilitate communication between and among administrators and recipients of the Robert Wood Johnson Foundation's Healthy Kids, Healthy Communities grant scheme. | Project-specific monitoring system that was discontinued after the end of the grant program. |
| Population Health Information Management System | Farrell et al. | Web-based documentation system that records adoption of key performance indicators of physical activity and nutrition policies by day care centers and primary schools. Local health districts use the system to plan, tailor, and monitor local service delivery, and to report their progress to the Ministry of Health. | Developed by the Ministry of Health of New South Wales, Australia, this system is used by local health districts to document and report progress in achieving the key performance indicators. |
| EvaluationWeb | Fee-for-service, customizable online data collection and reporting service. | Used by several US state health departments to collect HIV/AIDS prevention and treatment program data and to coordinate reporting requirements for the Centers for Disease Control and Prevention (CDC). |
| Compass by QTAC, NY | Fee-for-service, online database that collects and stores data about evidence-based chronic disease self-management workshops. Provides a list of current self-management programs, allows instructors to download required program forms and input participant data, and allows medical providers to refer patients to programs and to receive updates. Aggregates and reports data directly to the CDC to fulfill reporting requirements for their grantees. | Currently in use by the New York state health department and by the Oregon Health Authority to coordinate data collection to fulfill reporting requirements to the CDC. |
| Quality Improvement Program Planning System (QIPPS) | Fee-for-service, online project planning and evaluation database designed specifically for health promotion project planning and management activities. | At least two Australian state governments have used QIPPS to track and report on case studies of community-based health promotion projects; see Round et al. |
| DevResults | Software-as-service program for monitoring, evaluation, and project management. Collects raw data and tracks progress by location against selected indicators; includes project management functions such as budget tracking, task assignment, and document storage. | Used by large international development organizations for tracking international aid programs. |
| DHIS 1 & 2 | Free, open-source software to collect, validate, analyse, and present aggregate and patient-based statistical data. Highly customizable to aggregate and track site-level data in addition to patient-level data. Mobile capability available for patients and providers to manage eHealth records. | Adopted by Kenya, Tanzania, Uganda, Rwanda, Ghana, Liberia, and Bangladesh as their primary national health information system. Used in >40 countries. |
Drawn from information on the companies' websites in the public domain, or from the literature where available.
Examples drawn from the literature where possible, and from practice-based knowledge and experience of the authors.
Software-as-service is a model in which software is centrally hosted and supported by the vendor, and license fees are paid via a subscription.
Open-source refers to computer software whose source code is freely accessible and available for use by the general public. It can be modified and used by anyone, for any purpose, under the terms of an open-source license. To our knowledge, there are no examples of open-source software for e-monitoring of health promotion. The DHIS is presented as an example of open-source software used in clinical settings. Because it is open-source it is highly modifiable and could be adapted to the health promotion context.
Different philosophical research traditions observed to underpin electronic patient record research.
| Positivist | An external reality can be known and objectively measured |
| Interpretivist | “Reality” is inevitably understood/represented through the researcher's values, experience and identity |
| Critical | Research/facts/knowledge typically privilege the viewpoint of those in power. Critical perspectives challenge this |
| Recursive | “Reality” is generated within social structures that are reciprocally and recurrently reproduced by people's actions |
Following Greenhalgh et al.
Key theories for considering what the act of e-monitoring means in practice.
| Fit Between Individuals, Task and Technology (FITT) | Ammenwerth et al. | Suggests that a new technology leads to positive changes only if the attributes of the user group, the characteristics of the implemented technology, and the associated tasks match each other. | Features in Davidoff et al. | What are the characteristics of high users of the e-monitoring technology? |
| Institutional Theory | Scott et al. | Concerned with how the deepest and most resilient aspects of social structures (e.g., schemas, rules, behaviors, routines) are created and maintained. | Of interest because, in spite of the dubious effectiveness of electronic records in the health system, their transfer into health promotion will likely increase the legitimacy and authority of health promotion. | What is the highest level of authority in the state health department at which data from e-monitoring of health promotion is used? What are the ripple effects of this? |
| Practice theory | Feldman and Orlikowski | Examines the “constitutive” role of practices in producing organizational reality. Social life is the product of ongoing recurrent actions. | The implication is that health promotion practice will be shaped and “recreated” by digital implementation monitoring. | How does e-monitoring fit with existing practice? How will practice outside of the digital field be maintained (or not)? |
| Structuration theory | Giddens | Considers that social structures (relationships, traditions, moral codes, etc.) are the product of human agency (thoughts, decision-making, power) and vice versa. Larger social structures are the product of the repetition of actions by individuals at micro levels. | Shares similarities with other theories, but is often quoted when drawing attention to individual agency. | How is individual practitioner agency impacted by e-monitoring? What is the consequence of any shift in agency? |
| Normalization process theory (NPT) | May | Explains how an organizational practice, classification, technique, or artifact gets embedded into everyday life. Articulates associated mechanisms including coherence (sense-making), cognitive participation (relationship work), collective action (work to enact the practice), and reflexive monitoring (appraisal of the effects). | Considers implementation as a social process of collective action. NPT has been used extensively in understanding uptake of innovations in clinical practice and to explain factors that promote or inhibit the implementation of e-health systems. | What is the “talk” that accompanies e-monitoring in health promotion? How does it enact and embed the practice of monitoring? |
| Actor network theory | Callon | Considers how relationships between people are forged and continuously reshaped by the use of non-animate entities (like a technology). | Invites a focus on the way professional practice networks and intersectoral partnerships respond to the introduction of e-monitoring. | How has e-monitoring expanded or concentrated/centralized networks of practice? Have existing network structures influenced the adoption of e-monitoring? |
| Activity settings theory | O'Donnell et al. | Similar to structuration theory and practice theory; examines the everyday settings of life where the dynamic interaction of people and things produces regularized “scripts” or behaviors/practices/expectations. | Provides a systematic architecture for examining the properties of an activity setting. Unlike structuration theory, an advantage of activity settings theory for health promotion is that it provides guidance about the design of ecological interventions (interventions that focus on the properties of the context, not the people in them). This architecture also provides a scheme to analyse how digital implementation monitoring impacts key features of the setting, e.g., roles, resources, symbols, and time. | How has e-monitoring created new roles in practice? What is the authority and legitimacy of these roles? |
| Systems thinking | Foster-Fishman et al. | A system is a set of objects whose interconnected linkages together form a function or purpose. Theorizing the function and dynamics of “the whole” sheds light on the importance (or otherwise) of “the parts.” | Provides various schemas to theorize the context/setting into which the electronic monitoring practice is introduced and to appreciate how it might displace activity and align (or clash) with standards, norms, values, key functions, existing relationships, ongoing processes, history, etc. | What resources and activities have become aligned with e-monitoring? What activities has e-monitoring displaced? |
| Complex adaptive systems thinking | Axelrod and Cohen | Like the above, but emphasizes that the agents in the system (people, groups) have agency and take actions (adapt) in response to changes around them. From the multiple adaptations/actions come emergent (new) properties of the system, such as the capacity for the system to self-organize to achieve emerging new goals. | Using complexity science, the self-organizing properties of teams can be harnessed proactively in quality improvement interventions. | Has practice evolved new structures and routines? Is information for practice improvement being harnessed from new places? |
| Worldview theory | Geertz | Examines a person's or group's picture of how things (e.g., oneself, society, the nature of things) actually are. Encompasses ideals, norms, and values. | Known to influence the requirement to implement tobacco control policy in hospitals. | How does e-monitoring fit with espoused and observed values of practice? What things are practitioners willing/not willing to give up as e-monitoring gets adopted? |
| Consolidated Framework for Implementation Research (CFIR) | Damschroder et al. | Collates many different theories to explain the successful implementation of a new program or practice across six principal domains (44 constructs). These span different levels (e.g., individual, team/workplace setting, and policy environment). | Not designed to explain uptake of e-monitoring. | What organizational incentives and rewards support e-monitoring? How does an individual practitioner's sense of efficacy predict e-monitoring use? |