Literature DB >> 28228948

Health system frameworks and performance indicators in eight countries: A comparative international analysis.

Jeffrey Braithwaite1, Peter Hibbert1, Brette Blakely1, Jennifer Plumb2, Natalie Hannaford3, Janet Cameron Long1, Danielle Marks1.   

Abstract

OBJECTIVES: Performance indicators are a popular mechanism for measuring the quality of healthcare to facilitate both quality improvement and systems management. Few studies make comparative assessments of different countries' performance indicator frameworks. This study identifies and compares frameworks and performance indicators used in selected Organisation for Economic Co-operation and Development health systems to measure and report on the performance of healthcare organisations and local health systems. Countries involved are Australia, Canada, Denmark, England, the Netherlands, New Zealand, Scotland and the United States.
METHODS: Identification of comparable international indicators and analyses of their characteristics and of their broader national frameworks and contexts were undertaken. Two dimensions of indicators - that they are nationally consistent (used across the country rather than just regionally) and locally relevant (measured and reported publicly at a local level, for example, a health service) - were deemed important.
RESULTS: The most commonly used domains in performance frameworks were safety, effectiveness and access. The search found 401 indicators that fulfilled the 'nationally consistent and locally relevant' criteria. Of these, 45 indicators are reported in more than one country. Cardiovascular, surgery and mental health were the most frequently reported disease groups.
CONCLUSION: These comparative data inform researchers and policymakers internationally when designing health performance frameworks and indicator sets.

Entities:  

Keywords:  Epidemiology/public health; health performance; international comparison; performance frameworks; performance indicators; quality of care

Year:  2017        PMID: 28228948      PMCID: PMC5308535          DOI: 10.1177/2050312116686516

Source DB:  PubMed          Journal:  SAGE Open Med        ISSN: 2050-3121


Introduction

For more than two decades, regulators, policymakers, researchers and clinicians have endeavoured to improve the quality of healthcare by designing and applying indicators of performance. There are national and international incentives for rating the performance of health systems. The World Health Organisation (WHO)[1] and others[2,3] have attempted to rank health systems for the insights gained from global comparisons, while consumers have an interest in selecting the best provider for treatment for their particular condition and knowing that their taxes are being spent wisely.[4] To meet these multiple demands, performance indicators (‘measurable elements of practice performance for which there is evidence or consensus that they can be used to assess the quality, and hence change of quality, of care provided’) and performance frameworks (‘conceptual frameworks that set out the rationale and design principles for an indicator set’)[5,6] are typically designed to routinely monitor aspects of healthcare performance such as effectiveness, efficiency, safety and quality.[7] The quest for a single composite indicator of quality, prevalent in the early days of indicator development, has largely been abandoned in favour of multidimensional frameworks.[1] Indicator sets commonly contain a combination of structure, process and outcome assessments.[8] The Organisation for Economic Co-operation and Development (OECD) publishes 60 internationally comparable indicators of healthcare quality.[9] These are useful and influential. However, many countries, even those with advanced data systems, have difficulty linking practice performance to outcomes because of limitations in data availability and poor capabilities to link data. Notwithstanding these kinds of shortcomings, it is useful to assess the frameworks and performance indicators in a sample of countries for the insights this provides. 
Some health systems have moved faster than others in adopting performance indicators as tools for quality improvement and have made details of their indicators and systems for applying them publicly available at national, regional or institutional levels. We selected eight prominent health systems for review and assessment, purposively sampling exemplars in using indicators and in making their data and performance systems available: Australia, Canada, Denmark, England, the Netherlands, New Zealand, Scotland and the United States. At the time of our review, all had made progress in selecting or applying indicators to measure or stimulate improved performance, and most had developed a framework for conceptualising performance improvement or indicator use. This research aims to identify and analyse indicators, and their frameworks, which report on the performance of healthcare organisations and local health areas. This will provide comparative cases and information on progress for the benefit of regulators, policymakers and researchers within those countries and elsewhere, and is of particular use to policymakers interested in constructing future frameworks.

Methods

We searched for relevant performance indicators and their domains across the eight countries. Following this, we analysed performance indicators that were nationally consistent (used across the country rather than just regionally), locally relevant (measured and reported publicly at a local level, for example, a health service) and measured patient-level metrics. We conducted our study in four stages.

Stage 1: identify comparable nations using performance indicators to monitor and report on healthcare

To make comparison relevant, all selected countries are OECD members and have been classified by the World Bank as high income.[10] Data on the rates of health expenditure and life expectancy for 19 countries were obtained from Australia’s Health 2010[11] and from OECD reports[9] (including Australia, Austria, Canada, Denmark, Finland, France, Germany, Greece, Ireland, Italy, Japan, the Netherlands, New Zealand, Norway, Portugal, Spain, Sweden, the United Kingdom and the United States).[12] After screening by the research team, the eight countries we noted above were selected for detailed review on the basis that each had made substantial progress in using indicators and developing performance frameworks and had made their indicators and performance frameworks widely available.

Stage 2: finding performance indicators

We conducted our Internet search of performance indicator systems in the eight comparator countries in May 2013. The OECD and the Departments or Ministries of Health and associated government health organisations in each country were searched. A scoping table detailing the indicators by country was developed. Indicators were included if they were collected consistently on a national scale while remaining relevant and useful to local quality improvement efforts. The purpose of the table was to compile an initial ‘long-list’ of available indicators and then to identify a ‘short-list’ of those reported in multiple countries.

Stage 3: detailed review of selected performance indicators

The performance indicators were subjected to a detailed assessment and were classified according to whether they applied to community/hospital/population, country of origin and clinical specialty.

Stage 4: country-specific frameworks

The health system performance frameworks for each country were reviewed together with their accompanying online and published documentation. Domains within the performance frameworks were compared.

Results

Performance indicator frameworks

A summary of each country’s approach to performance indicator use is shown in Tables 1 and 2. Most of the eight countries have an overarching framework for the selection and reporting of indicators, which establishes the broader aims of their implementation activity and plays a large role in indicator selection and use. The number and focus of frameworks varied greatly between the eight countries, but typically included reference to both monitoring and improving the quality and efficiency of the healthcare system. There appears to be considerable overlap between the definitions of many of the domains, such as effectiveness and appropriateness. Indicators are sometimes also used to promote consumer choice at a regional or local level.
Table 1.

Summary of the eight countries: demographic characteristics and health performance frameworks.

Estimated population (rank)[a,13]: Australia 22,262,501 (55); Canada 34,568,211 (37); Denmark 5,556,452 (11); England 53,900,000[14] (22)[b]; the Netherlands 16,807,037 (64); New Zealand 4,365,113 (125); Scotland 5,300,000[14] (22)[b]; United States 316,668,567 (3)

Life expectancy at birth, overall years (rank): Australia 82 (10); Canada 82 (13); Denmark 79 (48); England 80 (30)[b]; the Netherlands 81 (21); New Zealand 81 (25); Scotland 80 (30)[b]; United States 79 (51)

Infant mortality, deaths per 1000 live births (rank): Australia 4.49 (190); Canada 4.78 (182); Denmark 4.14 (197); England 4.5 (189)[b]; the Netherlands 3.69 (205); New Zealand 4.65 (145); Scotland 4.5 (189)[b]; United States 5.9 (174)

GDP ($US) (rank): Australia 986.7 billion (19); Canada 1.513 trillion (14); Denmark 213.6 billion (55); England 2.375 trillion (9)[b]; the Netherlands 718.6 billion (24); New Zealand 134.2 billion (64); Scotland 2.375 trillion (9)[b]; United States 15.94 trillion (2)

GDP per capita ($US) (rank): Australia 42,400 (94); Canada 43,400 (142); Denmark 38,300 (32); England 37,500 (34)[b]; the Netherlands 42,900 (23); New Zealand 30,200 (50); Scotland 37,500 (34)[b]; United States 50,700 (14)

Healthcare expenditure (% GDP) (rank): Australia 8.7 (2010) (48); Canada 11.3 (15); Denmark 11.4 (14); England 9.6 (32)[b]; the Netherlands 11.9 (7); New Zealand 10.1 (30); Scotland 9.6 (32)[b]; United States 17.9 (2)

Type of health system:
- Australia: universal coverage (Medicare); voluntary private insurance available
- Canada: publicly funded; Medicare provides universal coverage for all hospital and physician services, with out-of-pocket expenses for dental, optometry and pharmaceuticals; voluntary private insurance available
- Denmark: publicly funded, with out-of-pocket expenses for dental, optometry and pharmaceuticals; voluntary private insurance available
- England: publicly funded (NHS); voluntary private insurance available
- The Netherlands: universal coverage ensured; mix of public and private insurance
- New Zealand: publicly funded; voluntary private insurance available
- Scotland: publicly funded (NHS); voluntary private insurance available
- United States: public and private insurance, majority private insurance

Health system performance frameworks:
- Australia: PAF and ROGS provide key conceptual principles
- Canada: framework conceptualised across four dimensions: (1) health status, (2) non-medical determinants of health, (3) health system performance and (4) community and health system characteristics
- Denmark: no framework as yet
- England: NHS Outcomes Framework; CCG Outcomes Indicator Set; QOF
- The Netherlands: overarching framework to meet four needs: (1) staying healthy, (2) getting better, (3) living independently with a chronic illness and (4) end-of-life care
- New Zealand: six health targets (three focus on patient access and three on prevention); Primary Health Organisation targets; quality and safety markers; Atlas of Healthcare Variation
- Scotland: 12 Quality Outcome Indicators; HEAT targets and other measurements at local and national levels; two locally reported
- United States: The Commonwealth Fund (no set framework; reports across health dimensions, see below); Hospital Compare (no framework; reports on seven dimensions, see below)

Dimensions/domains reported:
- Australia: PAF – safety, effectiveness, appropriateness, quality, access, efficiency, equity, competence, capability, continuity, responsiveness, sustainability; ROGS – effectiveness, appropriateness, quality, access, efficiency, equity
- Canada: eight domains – (a) acceptability, (b) accessibility, (c) appropriateness, (d) competence, (e) continuity, (f) effectiveness, (g) efficiency and (h) safety
- Denmark: under development
- England: NHS Outcomes – five domains: premature death, quality of life, recovery, positive experience and care/safety; CCG – adds to the overarching NHS Outcomes; QAO Framework – four domains: clinical, organisational, patient care experiences and additional services
- The Netherlands: three overarching themes – (1) quality of care, (2) access to care and (3) healthcare expenditure
- New Zealand: diverse themes; Atlas domains: maternity, gout, demography, cardiovascular disease, polypharmacy and surgical procedures
- Scotland: described as Quality Ambitions – safe, person-centred and effective
- United States: The Commonwealth Fund – four domains: access; prevention and treatment; costs and potentially avoidable hospital use; and health outcomes. Hospital Compare – seven dimensions: general information, timely and effective care, readmissions, complications and death, use of medical imaging, survey of patients’ experiences, Medicare payment and number of Medicare patients

Framework purpose:
- Australia (ROGS, PAF): to support improved local-level performance assessment; to support a safe, high-quality Australian health system through improved transparency and accountability
- Canada: to determine (1) the health of Canadians and (2) how well the health system performs, operating on the principles of providing reporting that is secure, respects Canadians’ privacy and is also consistent, relevant, flexible, integrated, user-friendly and accessible
- Denmark: N/A
- England: NHS Outcomes Framework and CCG Outcomes Indicator Set – to provide a national-level overview of how well the NHS is performing; to provide an accountability mechanism between the Secretary of State for Health and the NHS Commissioning Board for the effective spend of some £95 billion of public money; to act as a catalyst for driving up quality throughout the NHS by encouraging a change in culture and behaviour. The QOF is not about performance management per se, but about incentivising and rewarding good practice
- The Netherlands: used to compare healthcare system performance with other years and countries, against policy and procedure and, where possible, between healthcare providers
- New Zealand: health targets are a set of national performance measures designed to improve the performance of health services that reflect significant public and government priorities; they provide a focus for action for DHBs and are focussed on accountability, not quality improvement. Primary Health Organisation targets aim to improve the health of enrolled populations and reduce inequalities in health outcomes through supporting clinical governance and rewarding quality improvement within PHOs; improvements in performance against a range of nationally consistent indicators result in incentive payments to PHOs. QSMs will be used to evaluate the success of the national safety campaign and determine whether the desired changes in practice and reductions in harm and cost have occurred. The Atlas of Healthcare Variation aims to stimulate debate by highlighting variation, rather than making judgements about why variation exists or whether it is appropriate, leading to improvements in healthcare services
- Scotland: to structure and coordinate the range of measurements taken across NHS Scotland. The 12 QOIs are used for national reporting on longer term progress towards the Quality Ambitions and the Quality Outcomes; they are intended as indicators of quality and do not have associated targets. HEAT targets describe the specific and short-term priority areas for focussed action in support of the Quality Outcomes
- United States: The Commonwealth Fund uses comparative data to assess the performance of healthcare systems, establishes priorities for improvement and sets achievement targets. Hospital Compare aims to help stimulate and support improvements in the quality of care delivered by Medicare hospitals through the distribution of objective, easy-to-understand data on hospital performance and quality information from consumer perspectives

Data sources:
- Australia: multiple data sources as identified in the data plan 2013–2016; Australian Institute of Health and Welfare national data holdings; National Partnership Agreement data submissions; Australian Bureau of Statistics data; other collections
- Canada: Statistics Canada; CIHI; Canadian Hospital Reporting Project
- Denmark: Clinical Quality Development Programme (RKKP); individual registries and databases; Sundhed.dk
- England: Health and Social Care Information Centre; Royal College of Physicians
- The Netherlands: Dutch Hospital Databank
- New Zealand: Health Quality and Safety Commission/Atlas of Healthcare Variation; Primary Health Organisation Performance Programme
- Scotland: Information Services Division Scotland; Scottish Government
- United States: main sources include the Centers for Medicare and Medicaid Services, The Joint Commission, the Centers for Disease Control and Prevention, other Medicare data and data from within Hospital Referral Regions

DHB: District Health Board; GDP: gross domestic product; NA: not applicable; NHS: National Health Service; QAO: Quality and Outcomes; PAF: Performance and Accountability Framework; PHO: Primary Health Organisation; ROGS: Report on Government Services; QOF: Quality and Outcomes Framework; CCG: Clinical Commissioning Group; QOI: Quality Outcome Indicator; QSM: quality and safety marker; CIHI: Canadian Institute for Health Information.

[a] Rank refers to the CIA World Factbook,[15] with the country compared to the rest of the world.

[b] Figures are for the United Kingdom as a whole, not England or Scotland specifically.

Table 2.

Domains of performance indicators by country.

Frameworks compared (11): Australia – PAF; Australia – ROGS; England – High Quality Care for All; England – NHS Outcomes Framework; Canada – Canadian Health Indicator Framework; the Netherlands – dimensions of healthcare performance; the Netherlands – healthcare needs; Scotland – Quality Measurement Framework; United States – Agency for Healthcare Research and Quality; United States – Commonwealth Fund; OECD.

Number of frameworks reporting each domain (total):
Effectiveness: 8
Access: 7
Safety: 7
Efficiency: 5 (one framework reports this as ‘Efficiency and governance’)
Quality: 4
Appropriateness: 4
Outcomes of care/health improvement: 4 (in two frameworks, three and four domains, respectively, relate to outcomes)
Patient-centred/experience: 4
Cost: 3
Equity: 3
Responsiveness: 3
Competence/capability: 2
Continuity: 2
Timely: 1
Acceptability: 1
Sustainability: 1
Avoidable hospital use: 1

PAF: Performance and Accountability Framework; ROGS: Report on Government Services; NHS: National Health Service; OECD: Organisation for Economic Co-operation and Development.

In Australia, the National Health Performance Authority (NHPA)[16] was established under the Australian National Health Reform Act 2011[17] as an independent portfolio agency to monitor and report on healthcare system performance; it has since been merged with the Australian Institute of Health and Welfare. NHPA commenced operations in 2012. As part of its Strategic Plan 2012–2015,[16] NHPA is required to regularly review its Performance and Accountability Framework (PAF) to ensure it remains relevant and continues to address the needs of the Australian public for high-quality healthcare. The PAF consists of 48 national indicators: 31 for Medicare Locals (geographically based primary care co-ordinating agencies, now called Primary Health Care Networks) and 17 for the performance of Local Hospital Networks and hospitals[18] (see Table 1).
The Canadian framework has two main goals: to determine (1) the health of Canadians and (2) how well the health system performs, operating according to the published principles of providing reports that respect Canadians’ privacy and are also consistent, relevant, flexible, integrated, user-friendly and accessible.[19,20] The indicator framework is conceptualised in terms of the provision of high-quality comparative information across four dimensions. Within these, eight domains of health system performance are defined.[19,21] Denmark does not have a formal framework, although one is currently being developed. Instead, as a proxy framework, the Danish Institute for Quality and Accreditation in Healthcare (IKAS) manages the Danish Healthcare Quality Program (DDKM), a national accreditation- and standards-based programme. At the time of the study, this provided advanced indicators and applied them throughout the country. These standards are overseen by the International Society for Quality in Healthcare (ISQua).[22] The Danish National Indicator Project (DNIP) merged with the Clinical Quality Development Programme (RKKP) in 2010.[8] Although Denmark lacks a formal framework, the DNIP manual outlines the thinking behind its clinical indicators and planned future indicators: (1) to improve prevention, diagnostics, treatment and rehabilitation; (2) to provide documentation for setting priorities and (3) to create an information resource for patients and consumers. An example of a framework which operates at multiple geographical levels is that used in England’s National Health Service (NHS).
This comprises three performance frameworks: the NHS Outcomes Framework, which focuses on performance and accountability; the Clinical Commissioning Group (CCG) Outcomes Indicator Set, aimed at helping CCGs in planning and benchmarking and at providing information to consumers; and the Quality and Outcomes Framework (QOF), a voluntary pay-for-performance programme for general practice in England.[23] The Dutch framework, by comparison, is relatively streamlined and more consumer-focussed. Representatives of the Netherlands’ Ministry of Health, Welfare and Sport collaborated with academic researchers to develop the conceptual framework after reviewing the strategic goals of the healthcare system and the information needs of policymakers, and after studying existing theory and international experiences.[24] The resulting framework divides healthcare into four specific community needs: (1) staying healthy, (2) getting better, (3) living independently with a chronic illness and (4) end-of-life care.[25] New Zealand has included an atlas of healthcare variation as one of its four health system performance monitoring mechanisms,[26-29] and other countries such as the United States and the United Kingdom utilise an atlas, too. The atlas is organised according to clinical areas: maternity, demography, cardiovascular disease, gout, polypharmacy and surgical procedures.[29] In 2013, the NZ Health Quality and Safety Commission commenced its Open for Better Care campaign to measure whether planned changes in practice occur and whether they have resulted in reduced costs and harms.[28] Scotland conceptualised its Quality Measurement Framework on three levels to structure and coordinate the range of measurements that are taken across NHS Scotland.
For monitoring long-term progress, there are 12 Quality Outcome Indicators (QOIs), which do not have specific targets; short-term priority areas are addressed by the ‘HEAT’ targets: Health improvement for the people of Scotland (H), Efficiency and governance improvements (E), Access to services (A) and Treatment appropriate to individuals (T); and the third category includes all other national and local reporting requirements. In the United States, three identifiable entities report on healthcare performance: one nationally (the US Department of Health and Human Services’ Agency for Healthcare Research and Quality (AHRQ)), one internationally (The Commonwealth Fund) and one locally (Hospital Compare). While there is no single integrated framework, the AHRQ measures health system performance across four dimensions[13] and the Commonwealth Fund aims to be a catalyst for change by identifying promising practices to help the United States create a high-performing healthcare system.[14] The Commonwealth Fund spans four dimensions of health system performance: access, including insurance and affordability of care; prevention and treatment, including quality of ambulatory, acute, post-acute and palliative care; avoidable hospital use and cost, such as care that could have been avoided had the patient received appropriate care initially; and indicators assessing the extent to which people can enjoy long and healthy lives.[15] Of the 11 frameworks published in five countries and the OECD, the most frequently used (self-reported) domains were effectiveness (eight), access and safety (seven each) and efficiency (five; Table 2). There is likely to be considerable overlap between the definitions of some of the domains, such as effectiveness and appropriateness.
For example, the OECD considers these two domains as separate while the Australian framework considers appropriateness as a subset of effectiveness.[6,18] Because of this, and hierarchical relationships between domains within some frameworks, it is difficult to report the number of indicators used against each domain for each country.

Indicator choice

The search in eight countries found 401 indicators that fulfilled the ‘nationally consistent and locally relevant’ criteria we applied. Of these, 45 indicators are reported in more than one country. Table 3 contains a breakdown of indicators by country.
Table 3.

Nationally consistent and locally relevant indicators by country.

Country (primary source for an indicator): N
England: 111
Canada: 86
United States: 94
Denmark: 68
Australia: 56
New Zealand: 33
Scotland: 24
The Netherlands: 15
The search yielded 219 community-level, 231 hospital-level and 37 population-level indicators. Some indicators were classified into more than one category (Table 4).
Table 4.

Number of international indicators by domain and community, hospital and population.

Domain: community / hospital / population / total
Access: 41 / 45 / 0 / 86
Patient experience: 25 / 21 / 1 / 47
Safety and quality: 146 / 145 / 2 / 293
Efficiency: 2 / 11 / 0 / 13
Population health outcomes: 5 / 9 / 34 / 48
Total: 219 / 231 / 37 / N/A
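As a consistency check on Table 4, the per-domain row totals and the community/hospital/population column totals should agree on the same grand total; a minimal sketch, with the counts taken directly from the table:

```python
# Counts from Table 4: (community, hospital, population) per domain.
rows = {
    "Access":                     (41, 45, 0),
    "Patient experience":         (25, 21, 1),
    "Safety and quality":         (146, 145, 2),
    "Efficiency":                 (2, 11, 0),
    "Population health outcomes": (5, 9, 34),
}

# Column totals should reproduce the table's bottom row: 219, 231, 37.
col_totals = [sum(vals[i] for vals in rows.values()) for i in range(3)]
print(col_totals)  # [219, 231, 37]

# The grand total (487) exceeds the 401 unique indicators because some
# indicators were classified into more than one category.
print(sum(col_totals))  # 487
```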
We classified the indicators, where possible, into major disease groups (Table 5). Cardiovascular, surgery and mental health were the most frequently reported disease groups. Indicators tend to be more specifically linked to a clinical condition or disease group in some countries such as Denmark.[30]
Table 5.

Nationally consistent and locally relevant indicators by disease group.

Major clinical grouping: N (%)[a]
Cardiovascular disease: 62 (15)
Surgery: 45 (11)
Mental health: 42 (10)
Cancer: 26 (6)
Endocrine disease: 21 (5)
Respiratory disease: 20 (5)
Musculoskeletal: 17 (4)
Maternal and child health: 17 (4)
Emergency: 11 (3)
Radiology: 6 (1)
Chronic kidney disease: 5 (1)
Neurological disease: 4 (1)

[a] Denominator = 401.

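The percentage column in Table 5 is each group's count divided by the 401-indicator denominator, rounded to the nearest whole per cent; a minimal sketch reproducing it:

```python
# Indicator counts by major clinical grouping, from Table 5.
counts = {
    "Cardiovascular disease": 62, "Surgery": 45, "Mental health": 42,
    "Cancer": 26, "Endocrine disease": 21, "Respiratory disease": 20,
    "Musculoskeletal": 17, "Maternal and child health": 17,
    "Emergency": 11, "Radiology": 6, "Chronic kidney disease": 5,
    "Neurological disease": 4,
}
DENOMINATOR = 401  # all nationally consistent, locally relevant indicators

for group, n in counts.items():
    pct = round(100 * n / DENOMINATOR)
    print(f"{group}: {n} ({pct}%)")
```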

Review processes

Regular review of the performance framework and indicators is conducted in most of the eight countries by government or government-funded, arm’s-length bodies. For example, the Canadian framework has continually developed and evolved since its inception, as a result of collaboration by a dedicated group including the Canadian Institute for Health Information (CIHI), Statistics Canada (SC), Health Canada (HC) and representatives from other stakeholder groups.[19] Similarly, the National Institute for Health and Clinical Excellence (NICE) has a key role in indicator development in England. NICE is responsible for managing the development process of clinical and public health indicators for the QOF and the CCG indicator sets.[31] NICE also recommends whether the existing indicators should continue to be part of the frameworks. NICE has developed guides,[32,33] which set out in detail the processes involved in managing the development of indicators. Thus, indicators tend to be developed in a relatively open and transparent way, with input from individuals and stakeholder organisations. Of course, this statement masks the contested and political components of indicator development and use, which do not figure explicitly in policy documentation, academic articles or this review and are mostly country-specific.[34]

Reporting

The timing and mechanism of reporting on indicators were not consistent between countries, nor were they always internally consistent. This can be seen in the reporting on Canada’s health system performance, where various indicators are reported via multiple channels. There were 101 performance indicators listed on the SC website.[35] The CIHI also has indicators listed under the Canadian Hospital Reporting Project (CHRP). While some of these indicators are the same as those listed by SC, there are some additional hospital performance indicators (21 in total).[36] Additionally, the Government of Canada[37] has a candidate list of 70 indicators that were approved for use by Federal, Provincial and Territorial Ministers of Health in 2004. However, it is difficult to gauge how many indicators are in use, because only certain indicators are selected for inclusion in the annual reports and there appear to be various degrees of overlap. For example, the Health Indicators 2013 report[20] provides results for 88 indicators, 13 of which were additional indicators chosen to measure performance by socioeconomic status at provincial and national levels.[20] Although this appears confusing from an external perspective, variable reporting may be more effective in some instances, as CIHI addresses reporting needs by acknowledging different audiences and tailoring reporting to their requirements. Meanwhile, in the Netherlands, a report detailing results for all 125 indicators is published every 2 years.[38] From 2011, the information was updated via a website twice a year, and from 2014 the report was to be published every 4 years.[38] The indicators are reported at the national level, not locally; local (regional-level) reporting of indicators occurs via the Dutch Hospital Database.
There are two dedicated websites that provide healthcare consumers with information about the quality of a service and with ratings for that service:[39] Independer (www.independer.nl) and Kiesbeter (www.kiesbeter.nl, ‘Choose Better’). Similarly, in Denmark, clinical indicators are reported, and a structured audit process is initiated every 6 months by audit groups of clinicians at national, regional and local levels to explain the risk-adjusted results and to plan improvements. After the audit process is complete, the data are released publicly, together with comments on the results from the audit groups.[40] Reports on many of the indicators are available at www.sundhed.dk.

Discussion

In this study, the eight countries selected for review were using indicators and had implemented a performance framework, several for more than a decade. The progress they have made, and the choices taken in selecting and using indicators, might be of value for other health systems contemplating the development of their indicators or frameworks, or modifying their performance mechanisms. A key finding was the widespread support for implementing a healthcare system performance framework. The importance of a logical, universally acceptable and viable conceptual framework to underpin development of a national performance indicator set is also emphasised in the literature.[41,42] A conceptual framework sets out the rationale and design principles for the indicator set and links it to the wider health system context. It seeks to answer the question ‘performance of what – and to what ends?’[6] Reasons given for developing such a framework are as follows: (1) to define the scope and dimensions of measurement;[24,6] (2) to help align the performance measurement system with other policy priorities and financial exigencies;[43] (3) to provide a clearly defined vision to encourage buy-in by clinicians and consumers[24,43] and (4) by involving potential end-users of the data in the design of the framework, to ensure its future usefulness.[44] A conceptual framework encompassing multiple domains and with balanced representation across structure, process and outcome indicators is considered to be a key element of health reform over time.[45] Although we presented the self-reported domains by country, consistency of definitions between countries and the level of semantic overlap were not tested; however, both are likely to be substantial. For example, in the Australian PAF, the domain appropriateness is subordinate to, or a sub-class of, the domain effectiveness.[18] In Canada, these two domains are not grouped, but are classified as separate concepts.
Definitions for these domains are often not explicit in the policy documents.[20] Definitional consistency between countries should be the subject of further research and of efforts towards international standardisation. Although there is a substantial literature dealing with the design, properties and scientific soundness of individual indicators, considerably less attention has been given to how indicators are used in practice and to the impact they may have on the behaviour of health professionals or on the quality of care. While there is no settled answer to questions such as how many indicators are needed, which domains should be targeted or what the right mix of indicators should be, there is a fundamental debate over whether the purpose of performance indicators is accountability or quality improvement.[41-43,46] Internationally, there is a split between countries that emphasise public reporting and accountability (e.g. the UK NHS's 'star-ratings' system of 2001–2005)[47] and those that use results for non-publicised feedback to organisations to stimulate improvement. It is broadly agreed that monitoring performance imposes an inherent pressure on healthcare organisations or services to improve practice.[48] However, the extent to which this is accomplished is disputed and under-researched. The paucity of research examining the links between indicators and improvement may be due to the difficulty of attributing change to any particular policy initiative or intervention.[49] The literature supports the use of performance indicators, suggesting that their impact is more likely to be on provider than on consumer behaviour.[43,50] However, there is a general call for more good-quality studies of impact.[50-52] England and Canada undertake the most extensive research and development work to select indicators. The role of NICE in England exemplifies a thorough and considered approach to continued indicator development.
The process of review, seen especially in England, Canada and Australia, is critical to the continued development of performance indicators and their use.

Conclusion

A large amount of comparative information about international performance indicators is now available.[53] We examined the systems in use in eight countries. Assessing commonalities and differences between indicator specification and application in comparable health systems may be of value to regulators, policymakers, researchers and clinicians and forms a foundation for further research into the practical impact of indicators on the quality of healthcare.
References (10 of 18 shown)

1. Giuffrida A, Gravelle H, Roland M. Measuring quality of care with routine data: avoiding confusion between performance indicators and health outcomes. BMJ. 1999.

2. Collopy BT. Clinical indicators in accreditation: an effective stimulus to improve patient care. Int J Qual Health Care. 2000.

3. Arah OA, Klazinga NS, Delnoij DMJ, ten Asbroek AHA, Custers T. Conceptual frameworks for health systems performance: a quest for effectiveness, quality, and improvement. Int J Qual Health Care. 2003.

4. Berwick DM, James B, Coye MJ. Connections between quality measurement and improvement. Med Care. 2003.

5. ten Asbroek AHA, Arah OA, Geelhoed J, Custers T, Delnoij DM, Klazinga NS. Developing a national performance indicator framework for the Dutch health system. Int J Qual Health Care. 2004.

6. Arah OA, Westert GP, Hurst J, Klazinga NS. A conceptual framework for the OECD Health Care Quality Indicators Project. Int J Qual Health Care. 2006.

7. Adair CE, Simpson E, Casebeer AL, Birdsell JM, Hayden KA, Lewis S. Performance measurement in healthcare: part II – state of the science findings by stage of the performance measurement process. Healthc Policy. 2006.

8. Veillard J, Moses McKeag A, Tipper B, Krylova O, Reason B. Methods to stimulate national and sub-national benchmarking through international health system performance comparisons: a Canadian approach. Health Policy. 2013.

9. Boyce MB, Browne JP. Does providing feedback on patient-reported outcomes to healthcare professionals result in better outcomes for patients? A systematic review. Qual Life Res. 2013.

10. Arah OA, Westert GP. Correlates of health and healthcare performance: applying the Canadian Health Indicators Framework at the provincial-territorial level. BMC Health Serv Res. 2005.

