Nikhil Shah1, Sharon Mathew1, Amanda Pereira1, April Nakaima1, Sanjeev Sridharan2.
Abstract
Background: The Lancet Global Health Commission (LGHC) has argued that quality of care (QoC) is an emergent property that requires an iterative process to learn and implement. Such iterations are required because health systems are complex adaptive systems.
Objective: This paper explores the multiple roles that evaluations need to play in order to help with iterative learning and implementation. We argue that evaluation needs to shift from a summative focus toward an approach that promotes learning in complex systems. A framework is presented to help guide the iterative learning; it includes the dimensions of clinical care, person-centered care, continuum of care, and 'more than medicine'. Multiple roles of evaluation corresponding to each of these dimensions are discussed.
Keywords: Evaluation; iterative learning; person-centred care; quality of care
Year: 2021 PMID: 34148508 PMCID: PMC8216261 DOI: 10.1080/16549716.2021.1882182
Source DB: PubMed Journal: Glob Health Action ISSN: 1654-9880 Impact factor: 2.640
Figure 1. Proposed framework for quality of care
Relevance of realist evaluation for iterative learning
| Tenets of realist evaluation (70) | Implication for building iterative learning for quality of care |
|---|---|
| ‘The intervention is a theory or theories’ | The literature needs to specify both the direct and indirect pathways by which Quality of Care interventions can impact health outcomes. Many Quality of Care interventions are undertheorized: the mechanisms by which they impact outcomes are not fully described, and they often lack a well-specified theory of change. |
| ‘The intervention involves the actions of people’ | In our experience, the technical aspects of quality of care interventions are often fully described, but the role of human relationships and the human capabilities needed to enhance quality and health outcomes is rarely described as clearly. Further, the systems that can support the ‘relationship’ dimensions of Quality of Care are not often clearly specified. |
| ‘The intervention consists of a chain of steps or processes’ | Taking a complex adaptive system view towards Quality of Care implies a need to pay attention to the linkages between the multiple dimensions of quality of care. Further, the linkages between enhanced Quality of Care and health outcomes might be dynamic. There is a need to pay attention to the dynamic linkages between the multiple dimensions of Quality of Care as well as the linkages between Quality and client-level health outcomes. |
| ‘These chains of steps or processes are often not linear, and involve negotiation and feedback at each stage.’ | The effectiveness of any one component can be enhanced through feedback between the components. From an analytical perspective, while it may make sense to separate out the multiple dimensions of Quality of Care, the actual experience of the different dimensions as experienced by the client may not be as discrete over time. |
| ‘Interventions are embedded in social systems and how they work is shaped by this context.’ | The same quality of care intervention might work very differently in different settings. Part of an iterative learning approach will require ‘principled’ adaptations to specific contexts. |
| ‘Interventions are prone to modification as they are implemented.’ | An iterative learning strategy needs to inform adaptations of the quality of care interventions over time. These adaptations can be driven by the heterogeneous needs of clients that surface over time, or by the 5Is of context (the infrastructural, institutional, interpersonal, individual and intersectional components of contexts). |
| ‘Interventions are open systems and change through learning as stakeholders come to understand them.’ | There is a need for a learning system that promotes dialogue and learning between stakeholders. Without such an intentional learning system, opportunities to improve system and intervention-levels of Quality of Care can be lost. |
Questions to guide a comprehensive approach to QoC evaluations
| Evaluation focus | Questions for reflection |
|---|---|
| Purpose of evaluation | Does the QoC intervention need further development? If yes, how can evaluations help? Is the intervention ‘mature’ for a summative evaluation? If not, what are areas that need further development? If yes, what kinds of impacts will be explored in the evaluation? Is there a need for a mixed evaluation approach with an initial developmental approach and then movement to a summative evaluation? |
| Information and learning needs | Who are the key stakeholders? What do they need to learn? Understand the types of information that evaluation can provide that can help with implementation and learning |
| Theory of change | Does the intervention have multiple components (focussing on multiple dimensions) or is the focus mainly on just one dimension? Are the linkages between the different components clearly described? Is there ‘coherence’ in the overall causal package? Is there a clear theory of change of the intervention? Are the mechanisms by which QoC interventions can impact health outcomes clearly specified? Is there clarity about the types of contexts/support structures that will facilitate the linkages between QoC dimensions and health outcomes? |
| Context mapping | Is there clarity on the contexts that either support or hinder the implementation of the QoC interventions? How will the intervention need to be adapted to the contexts? What infrastructural, institutional, interpersonal, individual and intersectional contexts does the program need to respond to? |
| Capacity mapping | Are the human, technical and administrative capacities of the program sufficient to implement the multiple interventions? Can evaluations help identify the dimensions in which capacities are insufficient? Can key capacities be enhanced by actions from other actors in the system? |
| Anticipated impacts | How long will it take for the QoC intervention to improve health outcomes? Are there ideas on the anticipated trajectory/shape of the impacts? Will the proposed timeframe of the evaluation match the anticipated timeline of impacts? |
| Implementation fidelity and adaptation | Are sufficient adaptations being made to adapt to contexts? Is implementation integrity being maintained over time? Does the evaluation provide any information to help with adaptations? |
| Impacts | What evaluation designs are being implemented to study the impact over time of the QoC interventions on health outcomes? Are the impacts stable over time? Does the intervention impact different social groups differentially? Is there evidence of health equity impacts? Does the evaluation help develop clarity on the key drivers of impacts? Does the evaluation shed light on the contexts necessary for maximal impacts? Does the evaluation shed light on ‘what works for whom’? |
| Planning for sustainability | Are there explicit plans to sustain the intervention? How can the evaluation help with making decisions to sustain the intervention? Can the intervention be institutionalized? Can key learnings about effective mechanisms be mainstreamed? |
| Scaling-up | Is there a need in the overall system to implement such an intervention? Do the impact results support spreading of the intervention widely? Does the overall system have the capacity to implement this intervention? Can the mechanisms be spread across the system? |
| System-level learning and feedback | What can be done to improve each component of the QoC intervention? How can the linkages between the multiple QoC components be strengthened? How can the linkages between organizations in the system be strengthened? Are some key organizations missing from the delivery network? Can the linkages with community organizations be strengthened? What is a learning and communication system that can promote a learning culture around person-centered care? |
| Implications for evaluation of patient-centred care |
|---|
| An evaluation approach to person-centered care will help identify the choices and preferences of clients. In systems that are poorly resourced and under capacity, individuals might not feel empowered to express their choices and preferences. In such cases, evaluation may have a role to play in helping create conditions for patients to express their voice regarding their own choices and preferences. This is a hard problem because, in our experience, the shift from patients as recipients of healthcare to being active players in the co-production of their health requires a corresponding shift in values from the multiple actors involved in the delivery of healthcare. Evaluation itself might have a role to play in helping create awareness among healthcare providers of the importance of clients’ voice. Different evaluation methodologies might be needed to better understand patient preferences. What are the processes by which women, who have not formerly been regular users of healthcare systems, feel empowered to express their preferences and choices? What kinds of spaces promote dialogue and understanding between healthcare providers and clients? What are ways in which there can be non-confrontational enhancements of dignity and respect toward clients? |
| Each of these questions must be addressed in a contextually informed way, paying attention to the political and cultural constraints of specific healthcare settings and participants. |
| Implications for evaluation of continuum of care |
|---|
| In our experience, it is rare that the continuum of care in a setting is clearly mapped. A key role for evaluations is thus to more clearly map the continuum as well as the gaps in such a continuum. One important purpose of evaluations will involve following women longitudinally across their journeys. What are critical gaps in the continuum of care? Are disadvantaged areas especially deficient in the connectivity across the continuum? What are solutions to enhance the connectivity of especially disadvantaged clients across the continuum? |
| Evaluations can help both in understanding which types of interventions can assist in navigating the continuum, as well as the spatial [ |
| Implications for evaluation of more than medicine |
|---|
| An evaluation of the ‘more than medicine’ dimension will explore if and how care being delivered in a facility is sensitive to the kinds of support systems that women have at home or in the community. Evaluations can also explore whether women with more favorable trajectories of outcomes also live in households/communities that have greater supports. From a more developmental perspective, an evaluation could also explore how care in the facility needs to understand the types of support systems that exist within the household and the community, and incorporate them into the care process [ |
| Implications for evaluation of technical quality |
|---|
| While evaluations can help identify whether good quality of care is associated with longer-term outcomes, it is helpful to adopt a longitudinal evaluation approach in order to understand whether good quality clinical care is associated with differential trajectories of outcomes for different types of clients. There is value in assessing potential inequities in the impacts of good quality clinical care across different types of clients. Additionally, evaluation can help clarify the longitudinal relationship between the different dimensions of quality of care and longer-term outcomes. Put differently, what proportion of longer-term outcomes is driven purely by the clinical dimension of care? Does technical quality have differential impacts on different social groups because of a range of other factors? Even for this technical dimension, there is value both in conducting a summative evaluation and in addressing questions that explore the connections between this dimension of quality and the other dimensions. |