| Literature DB >> 28372574 |
A Wind1,2, W H van Harten3,4,5.
Abstract
BACKGROUND: Although benchmarking may improve hospital processes, research on this subject is limited. The aim of this study was to provide an overview of publications on benchmarking in specialty hospitals and a description of study characteristics.
Keywords: Benchmarking; Quality improvement; Specialty hospitals
Year: 2017 PMID: 28372574 PMCID: PMC5379508 DOI: 10.1186/s12913-017-2154-y
Source DB: PubMed Journal: BMC Health Serv Res ISSN: 1472-6963 Impact factor: 2.655
Fig. 1Research design
Classification scheme for benchmarking by Fong et al. [16]
| Classification | Type | Meaning |
|---|---|---|
| Nature of benchmarking partner | Internal | Comparing the performance of similar business units or processes within one organization |
| | Competitor | Comparing with direct competitors to catch up with or even surpass their overall performance |
| | Industry | Comparing with companies in the same industry, including non-competitors |
| | Generic | Comparing with an organization that extends beyond industry boundaries |
| | Global | Comparing with an organization whose geographical location extends beyond country borders |
| Content of benchmarking | Process | Pertaining to discrete work processes and operating systems |
| | Functional | Application of process benchmarking that compares particular business functions at two or more organizations |
| | Performance | Concerning outcome characteristics, quantifiable in terms of price, speed, reliability, etc. |
| | Strategic | Involving assessment of strategic rather than operational matters |
| Purpose for the relationship | Competitive | Comparison for gaining superiority over others |
| | Collaborative | Comparison for developing a learning atmosphere and sharing of knowledge |
Fig. 2Article selection process
Charting categories and associated content for the general information on the benchmarking studies
| First author (Year) | Aim | Area of practice |
|---|---|---|
| Brucker (2008) [ | Establish a nationwide network of breast centres; to define suitable quality indicators (QIs) for benchmarking the quality of breast cancer (BC) care; to demonstrate existing differences in BC care quality; and to show that BC care quality improved with benchmarking from 2003 to 2007. | Breast cancer centers Germany |
| Chung (2010) [ | Developing organization-based core measures for colorectal cancer patient care and applying these measures to compare hospital performance. | Hospitals registered in the TCDB program in Taiwan |
| Hermann (2006) [ | To identify quality measures for international benchmarking of mental healthcare that assess important processes and outcomes of care, are scientifically sound, and are feasible to construct from pre-existing data. | Mental health care professionals from six countries (UK, Sweden, Canada, Australia, Denmark, and the USA) and one international organization, the European Society for Quality in Healthcare (ESQH) |
| Mainz (2009) [ | Describing and analyzing the quality of care for important diseases in the Nordic countries (Denmark, Finland, Greenland, Iceland, Norway and Sweden). | Cancer treatment facilities from the different Nordic countries (Denmark, Finland, Greenland, Iceland, Norway and Sweden) |
| Miransky (2003) [ | Describing the development of a database for benchmarking outcomes for cancer patients. | A consortium of 12 Comprehensive Cancer Centers in the US |
| Roberts (2012) [ | The study had three main aims, to: (i) adapt the acuity-quality workforce planning method used extensively in the UK National Health Service (NHS) for use in hospices; (ii) compare hospice and NHS palliative care staffing establishments and their implications; and (iii) create ward staffing benchmarks and formulae for hospice managers. | Twenty-three palliative care and hospice wards, geographically representing England. |
| Setoguchi (2008) [ | Comparing prospectively and retrospectively defined benchmarks for the quality of end-of-life care, including a novel indicator for the use of opiate analgesia. | Seniors with breast, colorectal, lung, or prostate cancer who participated in state pharmaceutical benefit programs in New Jersey and Pennsylvania |
| Stewart (2007) [ | Develop tools that lead to better-informed decision making regarding practice management and physician deployment in comprehensive cancer centers and determine benchmarks of productivity using RVUs (Relative value units) accrued by physicians at each institution. | 13 major academic cancer institutions with membership or shared membership in the National Comprehensive Cancer Network (NCCN) |
| Stolar (2010) [ | Performing a blinded confidential financial performance survey of similar university pediatric surgery sections to start benchmarking performance and define relationships. | 19 pediatric surgery sections of university children’s hospitals |
| Van Vliet (2010) [ | Comparing process designs of three high-volume cataract pathways in a lean thinking framework and to explore how efficiency in terms of lead times, hospital visits and costs is related to process design. | Three eye hospitals in the UK, the USA and the Netherlands |
| Wallwiener (2011) [ | Summarize the rationale for the creation of breast centres and discuss the studies conducted in Germany; to obtain proof of principle for a voluntary, external benchmarking programme and proof of concept for third-party dual certification of breast centres and their mandatory quality management systems. | Breast centers in Germany |
| Wesselman (2014) [ | Present data from the third annual analysis of the DKG-certified colorectal cancer centers with a particular focus on indicators for colorectal cancer surgery. | Colorectal cancer centers certified by the German Cancer Society (DKG) |
| Barr (2012) [ | Revision of 2011 predictions with the use of National Practice Benchmark (NPB) reports from 2011 and development of new predictions. Design of a conceptual framework for contemplating these data based on an ecological model of the oncology delivery system. | Oncology practices in the USA |
| Brann (2011) [ | To explore the performance of child and adolescent mental health organizations by providing an overview of the findings from two projects, undertaken to explore the variability in organizations’ performance on particular KPIs (key performance indicators). | Six child and adolescent mental health organizations |
| De Korne (2010) [ | The purpose of this study was to evaluate the applicability of an international benchmarking initiative in eye hospitals. | Nine eye hospitals spread over Asia (3), Australia (1), Europe (4), and North America (1). |
| De Korne (2012) [ | The aims of this study were to assess the applicability of a benchmarking project in U.S. eye hospitals and compare the results with an international initiative. | Five eye hospitals in the US |
| Schwappach (2003) [ | Assess the effects of uniform indicator measurement and group benchmarking, followed by hospital-specific activities, on clinical performance measures and patients’ experiences with emergency care in Switzerland. | Emergency departments of 12 community hospitals in Switzerland, participating in the ‘Emerge’ project. |
| Shaw (2003) [ | To answer basic questions, using precise definitions, regarding emergency department (ED) utilization, wait times, services, and attending physician staffing of representative pediatric EDs (PEDs). | 21 Pediatric emergency departments (PED) from 14 states of the USA. |
| Van Lent (2010) [ | Examine benchmarking as part of an approach to improve performance in specialty hospitals | International comprehensive cancer centres (CCC) or departments within a CCC in Europe and the US |
| Ellershaw (2008) [ | To evaluate the utility of participating in two benchmarking exercises to assess the care delivered to patients in the dying phase using the Liverpool Care Pathway for the Dying Patient (LCP). | Two cancer networks in the northwest of England |
| Ellis (2006) [ | Review published descriptions of benchmarking activity and synthesize benchmarking principles to encourage the acceptance and use of Essence of Care as a new approach to continuous quality improvement, and to promote its acceptance as an integral and effective part of benchmarking activity in health services. | NHS (UK) |
| Matykiewicz (2005) [ | Introduce Essence of Care, a benchmarking tool for health care practitioners and an integral part of the UK National Health Service (NHS) Clinical Governance agenda | Health care practitioners NHS (UK) |
| Profit (2010) [ | To present a conceptual framework to develop comprehensive, robust, and transparent composite indicators of pediatric care quality, and to highlight aspects specific to quality measurement in children. | The Pediatric Data Quality Systems (Pedi-QS) Collaborative Measures Workgroup (consensus panel by the National Association of Children’s Hospitals and Related Institutions, Child Health Corporation of America, and Medical Management Planning) |
| Greene (2009) [ | Describing the role of the hospital registry in achieving outcome benchmarks in cancer care | Carolinas Medical Center (US) |
Fig. 3Number of publications per category and area of practice
Summary of the analysis of the pathway benchmarking projects
| Author | Study design | Benchmarking model and/or steps | Indicators | Outcomes | Impact (improvements/improvement suggestions) | Success factors |
|---|---|---|---|---|---|---|
| Brucker [ | Prospective interventional multi-centre feasibility study. | Partner: Industry | Quality outcome indicators derived from clinically relevant parameters. | The results from this study provide proof of concept for the feasibility of a novel, voluntary, nationwide system for benchmarking the quality of BC care | Marked QI (Quality Indicators) increases indicate improved quality of BC care. | The project was voluntary and all data were anonymized. |
| Chung [ | Multi comparisons study and the development of core measures for colorectal cancer including a modified Delphi method. | N.A. | Quantitative structure, process and outcome indicators | Developing core measures for cancer care was a first step to achieving standardized measures for external monitoring, as well as for providing feedback and serving as benchmarks for cancer care quality improvement. | N.A. | N.A. |
| Hermann [ | Multi comparisons study and indicator consensus development process (with elements of the Delphi method). | Partner: Industry/Global | Process and outcome indicators. | The benchmark itself was not performed; indicators were developed for a possible future benchmark. | N.A. | N.A. |
| Mainz [ | Multi comparisons study and the development of indicators based on consensus of a working group | N.A. The results that are available for the prioritized quality indicators cannot really be used for true comparisons and benchmarking | Outcome indicators | A major difference between the Nordic countries has been identified with regard to 5-year survival for prostate cancer. | N.A. | N.A. |
| Miransky [ | Multi comparisons study with stakeholder consensus methods. Use of a specialized database for benchmarking outcomes for cancer patients. Conference calls and joint meetings between comprehensive cancer centers and possible benchmark vendors were used to develop this benchmarking database. | Partner: Industry | Development of a database containing outcome indicators. Benchmarking clinical outcomes and patient | The various databases developed by the collaborative provided the tools through which the group accomplished its goals. | Each consortium member is expected to participate in one quality improvement initiative annually | N.A. |
| Roberts [ | Multi comparisons study on staffing and inpatient data at hospices. Study design drew extensively from a UK-wide nursing study (The UK Best practice Nursing Database). | N.A. | Mixture of indicators, both qualitative and quantitative and process and outcome indicators | A broader NHS ward data system, was successfully converted for hospice use. The resultant hospice and palliative care ward data show that, compared to NHS palliative care wards, charitable hospices: (i) look after fewer patients, but generate greater workloads owing to higher patient-dependency and acuity scores; (ii) are much better staffed; and (iii) achieve higher service-quality scores. | N.A. | N.A. |
| Setoguchi [ | Retrospective and prospective cohort study. | Partner: Industry | Outcome indicators | Retrospective and prospective measures, including a new measure of the use of opiate analgesia, identified similar physician and hospital patterns of end-of-life care. | Findings suggest that the use of opiates at the end of life can be improved | N.A. |
| Stewart [ | Multi comparisons study (clinical productivity and other characteristics of oncology physicians). Data collection by survey | Partner: Industry | Outcome productivity indicators | Specific clinical productivity targets for academic oncologists were identified. A methodology for analyzing potential factors associated with clinical productivity and developing clinical productivity targets specific for physicians with a mix of research, administrative, teaching, and clinical salary support. | N.A. | N.A. |
| Stolar [ | Multi comparisons study using a non-searchable anonymous data capture form through SurveyMonkey. Feedback from stakeholders and availability of information was used to develop indicators. A final questionnaire, containing 17 questions, was sent to thirty pediatric surgery practices. | N.A. | Quantitative outcome indicators | A review of the clinical revenue performance of the practice illustrates that pediatric surgeons are unable to generate sufficient direct financial resources to support their employment and practice operational expenses. | The value of the services must accrue to a second party | N.A. |
| Van Vliet [ | A retrospective comparative benchmark study with a mixed-method design | Partner: Industry/Global | N.A. | The environmental context and operational focus primarily influenced process design of the cataract pathways. | When pressed to further optimize their processes, hospitals can use these systematic benchmarking data to decrease the frequency of hospital visits, lead times and costs. | N.A. |
| Wallwiener [ | Review of existing literature/data. | Partner: Industry | Structural and process indicators | The voluntary benchmarking programme has gained wide acceptance among DKG/DGS-certified breast centres. The goal of establishing a nationwide network of certified breast centres in Germany can be considered largely achieved. | Improvements in surrogate parameters as represented by structural and process quality indicators suggest that outcome quality is improving. | N.A. |
| Wesselman [ | Review of existing literature/data. Analysis of existing benchmarking reports of cancer centers. | Partner: Industry | Respective and guideline-based outcome indicators | The present analysis of the results, together with the centers’ statements and the auditors’ reports, shows that most of the targets for indicator figures are being better met over the course of time. | There is a clear potential for improvement and the centers are verifiably addressing this. | N.A. |
N.A. = Not applicable
Summary of the analysis of the institutional benchmarking projects. N.A. = Not applicable
| Author | Study design | Benchmarking model and/or steps | Indicators | Outcome | Impact (improvements/improvement suggestions) | Success factors |
|---|---|---|---|---|---|---|
| Barr [ | Multi comparisons study using the National Practice Benchmark. | Partner: Industry | N.A. | The National Practice Benchmark reveals a process of change that is reasonably orderly and predictable, and demonstrates that the adaptation of the oncology community is directional, moving toward gains in efficiency as assessed by a variety of measures. | N.A. | To make the survey more accessible, it was stratified into 2 sections (minimum data set and extra). |
| Brann [ | Multi comparisons study in which representatives from child and adolescent mental health organizations used eight benchmarking forums to compare performance against relevant KPIs. | N.A. | Key performance indicators looking at outcomes in mental health | Benchmarking has the potential to illuminate intra- and inter-organizational performance. | N.A. | 1. Commitment of the management and securing resources. 2. Feeding benchmarking data back to clinical staff for data interpretation, to maintain their motivation for the project. 3. Forums for participants to provide them with the opportunity to discuss the performance of their organisation and draw lessons from other organisations. |
| De Korne [ | Mixture of methods: a systematic literature review and semi-structured interviews. An evaluation frame (based on a systematic literature review) was applied longitudinally to a case study of nine eye hospitals that used a set of performance indicators for benchmarking. | Partner: Industry/Global | Performance outcome indicators | The benchmarking indicators were mostly used to initiate and to facilitate discussions about management strategies. The eye hospitals in this study were not successful in reaching the goal of quantifying performance gaps or identifying best practices. | Indicators for benchmarking were not incorporated in a performance management system in any of the hospitals, nor were results discussed with or among employees; only the strategic level was involved. | Performance indicators should; |
| De Korne [ | Mixture of methods: quantitative analysis included (i) analysis of fiscal year 2009 benchmarking performance data and (ii) evaluation of multiple cases by applying an evaluation frame abstracted from the literature to five U.S. eye hospitals that used a set of 10 indicators for efficiency benchmarking. Qualitative analysis of interviews, document analyses, and questionnaires. | Partner: Industry | Efficiency outcome indicators | The benchmark initiative fulfilled many of its purposes, namely, identifying performance gaps, implementing best practices, and stimulating exchange of knowledge. | Case studies showed that, to realize long-term efforts, broader cooperation is necessary. | 1. the 4P model suggests that reliable and comparable indicators are a precondition for a successful benchmark, 2. case studies suggest that the development process is an important part of benchmarking. 3. homogeneity in language, reimbursement systems, and administrations |
| Schwappach [ | Prospective and retrospective mixed methods | Partner: Industry | Outcome indicator set including two main components: objective measures that evaluate clinical performance in terms of speed and accuracy of patient assessment, and patients’ experiences with care provided by EDs. | Concordance of prospective and retrospective assignments to one of three urgency categories improved significantly by 1%, and both under- and over-prioritization were reduced. Significant improvements in the reports provided by patients were achieved and were mainly demonstrated in structures of care provision and perceived humanity. | A number of improvement activities were initiated in individual hospitals covering a wide range of targets, from investment in ED structures to professional education and organization of care. | Interpretation of results should be guided by a culture of organisational learning rather than individual blame. |
| Shaw [ | Multi comparisons study with the use of questionnaire containing ten questions. | N.A. | 10 ‘questions’ regarding ED patient utilization, wait times, services, and attending physician staffing of the nation’s | Benchmarking of PEM staffing and performance indicators by PEM directors yields important administrative data. PEDs have higher census and admission rates compared with information from all EDs, while their attending staffing, wait times, and rate of patients who leave without being seen are comparable to those of general EDs. | In larger departments, the opening of fast tracks during high census times has allowed for shorter disposition of lower acuity patients with good success, this has been recommended as one of the solutions to better ED throughput. | N.A. |
| Van Lent [ | Multi comparisons study internationally benchmarking operations management in cancer centres. | Partner: Industry/Global | Outcome indicators containing a numerator and a denominator. The selected indicators distinguished between the total organization level, diagnostics, surgery, medication-related treatments, radiotherapy and research. | The results on the feasibility of benchmarking as a tool to improve hospital processes are mixed. Success factors identified are a well-defined and small project scope, partner selection based on clear criteria, stakeholder involvement, simple and well-structured indicators, and analysis of both the process and its results. | All multiple case studies provided areas for improvement and one case study presented the results of a successful improvement project based on international benchmarking. | 1. Internal stakeholders must be convinced that others might have developed solutions for problems that can be translated to their own settings. 2. Management must reserve sufficient resources for the complete benchmark. 3. Limit the scope to a well-defined problem. 4. Define criteria to verify the comparability of benchmarking partners based on subjects and process. 5. Construct a format that enables a structured comparison. 6. Use both quantitative and qualitative data for measurement. 7. Involve stakeholders to gain consensus about the indicators. 8. Keep indicators simple so that enough time can be spent on the analysis of the underlying processes. 9. For indicators showing a large annual variation in outcomes, measurement over a number of years should be considered. 10. Adapt the identified better working methods so that they comply with other practices in the organisation. |
Summary of the analysis of the benchmarking evaluation/methodology studies
| Author | Study design | Benchmarking model and/or steps | Indicators | Outcome | Impact (improvements/improvement suggestions) | Success factors |
|---|---|---|---|---|---|---|
| Ellershaw [ | Survey to assess the usefulness of benchmarking with the Liverpool Care Pathway | Partner: Industry | N.A. | Whilst almost three quarters of the respondents in the hospital sector felt that participation in the benchmark had had a direct impact on the delivery of care, only around a third in the other two sectors (hospice and community) felt the same. | Specific improvements in levels of communication between health professionals and relatives, within multidisciplinary teams and across sectors occurred as a result of participation in the benchmarking exercise. | Holding a workshop for participants to reflect on data enhances understanding and learning from others. |
| Ellis [ | Literature review to encourage the acceptance and use of Essence of Care as a new benchmarking approach. | Partner: Industry | N.A. | Essence of Care benchmarking is a sophisticated clinical practice benchmarking approach which needs to be accepted as an integral part of health service benchmarking activity to support improvement in the quality of patient care and experiences. | N.A. | 1. Reciprocity |
| Matykiewicz [ | Case study approach and qualitative methods namely interviews and focus groups | Partner: Industry | Best practice indicators | Whilst raising awareness is relatively straightforward, putting Essence of Care into practice is more difficult to achieve, especially when happening at a time of significant organizational change. | Through self-assessment against the best practice indicators, a problem was identified which, if not dealt with, could have escalated to a more serious situation. The manager saw this as an opportunity to learn from mistakes and initiated a service review that has since resulted in the service being redesigned. | 1. Workshops (successful in raising awareness, help people to understand how to apply the benchmarking process in practice) |
| Profit [ | Literature review on composite indicator development, health systems, and quality measurement in the pediatric healthcare setting. | N.A. | No indicators were mentioned, however a conceptual framework to develop comprehensive, robust, and transparent composite indicators of pediatric care quality was developed. The model proposed identifying structural, process, and outcome metrics for each of the Institute of Medicine’s six domains of quality. | The combination of performance metric development methodology with Profit et al.’s quality matrix framework may result in a unique approach for quality measurement that is fair, scientifically sound, and promotes the all-important provider buy-in. The framework presented offers researchers a path to composite indicator development. | N.A. | N.A. |
N.A. = Not applicable
Summary of the analysis of Benchmark study using patient registry data
| Author | Study design | Benchmarking model and/or steps | Indicators | Outcome | Impact (improvements/improvement suggestions) | Success factors |
|---|---|---|---|---|---|---|
| Greene [ | Development of a cancer committee; review of the NCDB reports from the Electronic Quality Improvement | N.A. | Outcome indicators | In addition to a role in benchmarking, registry data may be used to assist in establishing new research protocols and in determining market share by the hospital administration. The registry identified several issues which included the lack of physician office contact information, and time lapse for treatment completion. | Two potential issues were identified. With instruction for the pathologists and surgeons regarding these issues, this rate is expected to improve. | N.A. |
N.A. = Not applicable
Success factors benchmarking projects specialty hospitals and pathways
| 1. Voluntary participation |
| 2. Anonymous participation |
| 3. Internal stakeholders must be convinced that others might have developed solutions for problems that can be translated to their own settings |
| 4. Verify the homogeneity of the participant group to ensure the comparability of benchmarking partners |
| 5. Ensure commitment of the management and secure resources |
| 6. Limit the scope of the project to a well-defined problem |
| 7. Involve stakeholders to gain consensus about the indicators |
| 8. Develop indicators that are specific, measurable, acceptable, achievable, realistic, relevant, and timely (SMART) |
| 9. Use simple indicators so that enough time can be spent on the analysis |
| 10. Measure both qualitative and quantitative data |
| 11. Stratify the survey into a minimum data set and additional extras |
| 12. For indicators showing a large annual variation in outcomes, measurement over a number of years should be considered |
| 13. Feed benchmarking data back to clinical staff to maintain their motivation for the project |
| 14. Organize forums and workshops for participants to discuss performance of their organization and learn from other organizations |
| 15. Convert data into measurable quantities |
| 16. Homogeneity in language, reimbursement systems, and administrations |
| 17. Interpretation of results should be guided by a culture of organisational learning rather than individual blame. |