Amelia M Bond1,2, Kevin G Volpp3,4,5, Ezekiel J Emanuel3,4,5, Kristen Caldarella5, Amanda Hodlofski6, Lee Sacks7, Pankaj Patel7, Kara Sokol7, Salvatore Vittore7, Don Calgano7, Carrie Nelson7, Kevin Weng7, Andrea Troxel8, Amol Navathe3,4,5. 1. Health Care Management, The Wharton School, University of Pennsylvania, Philadelphia, PA, USA. amb2036@med.cornell.edu. 2. Department of Healthcare Policy and Research, Weill Cornell Medicine, New York, NY, USA. amb2036@med.cornell.edu. 3. Leonard Davis Institute of Health Economics, The Wharton School, University of Pennsylvania, Philadelphia, PA, USA. 4. Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA. 5. Division of Health Policy, University of Pennsylvania, Philadelphia, PA, USA. 6. HealthCore, Inc., Wilmington, DE, USA. 7. Advocate Health System, Chicago, IL, USA. 8. Department of Population Health, New York University School of Medicine, New York, NY, USA.
Abstract
BACKGROUND: Pay-for-performance (P4P) has been used expansively to improve the quality of care delivered by physicians. However, the extent to which P4P works through the provision of information versus financial incentives is poorly understood. OBJECTIVE: To determine whether an increase in information feedback, without changes to financial incentives, resulted in improved physician performance within an existing P4P program. INTERVENTION/EXPOSURE: Implementation of a new registry enabling real-time feedback to physicians on quality measure performance. DESIGN: Observational, predictive piecewise model at the physician-measure level to examine whether registry introduction was associated with changes in performance. We used detailed physician quality measure data from 3 years prior to registry implementation (2010-2012) and 2 years after implementation (2014-2015). We also linked physician-level data, including age, gender, and board certification; group-level data, including registry click rates; and patient panel data, including chronic conditions. PARTICIPANTS: Four hundred thirty-four physicians continuously affiliated with Advocate from 2010 to 2015. MAIN MEASURES: Physician performance on ten quality metrics. KEY RESULTS: We found no consistent pattern of improvement associated with the availability of real-time information across the ten measures. Relative to predicted performance without the registry, average performance increased for two measures (childhood immunization status-rotavirus (p < 0.001) and diabetes care-medical attention for nephropathy (p = 0.024)) and decreased for three measures (childhood immunization status-influenza (p < 0.001), diabetes care-HbA1c testing (p < 0.001), and diabetes care-poor HbA1c control (p < 0.001)). Results were consistent in a subgroup analysis of those most able to improve, i.e., physicians in the bottom tertile of performance prior to registry introduction. Physicians who improved most were in groups that accessed the registry more often than those who improved least (8.0 vs 10.0 times per week, p = 0.010). CONCLUSIONS: More frequent provision of information, delivered in real time, was insufficient to improve physician performance in an existing P4P program with high baseline performance. These results suggest that electronic registries may not themselves drive performance improvement. Future work should consider testing information feedback enhancements combined with financial incentives.
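The DESIGN describes an observational piecewise model that compares observed post-registry performance with the level predicted from the pre-registry trend. A minimal sketch of that approach, on synthetic data and with illustrative variable names (none of this is the authors' code), might look like:

```python
# A minimal sketch (not the authors' code) of a piecewise regression of the
# kind described under DESIGN: performance is modeled with a time trend plus
# a level shift and slope change at registry introduction (2013). All data
# below are synthetic; variable names are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Study years: 3 pre-registry (2010-2012) and 2 post-registry (2014-2015).
years = np.array([2010, 2011, 2012, 2014, 2015], dtype=float)
t = years - 2010                      # linear time trend
post = (years >= 2013).astype(float)  # 1 after registry introduction
t_post = post * (years - 2013)        # slope change in the post period

# Synthetic performance rates for one physician-measure pair.
perf = 0.80 + 0.01 * t + rng.normal(0.0, 0.005, size=len(years))

# Fit perf ~ intercept + t + post + t_post by least squares. The `post` and
# `t_post` coefficients capture how observed post-period performance deviates
# from the level the pre-period trend alone would predict.
X = np.column_stack([np.ones_like(t), t, post, t_post])
beta, *_ = np.linalg.lstsq(X, perf, rcond=None)
print(dict(zip(["intercept", "trend", "level_shift", "slope_change"], beta)))
```

In this framing, a registry effect would appear as a significant level shift or slope change relative to the pre-period counterfactual, which is the comparison the KEY RESULTS report measure by measure.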
Keywords: evaluation; health information technology; performance measurement; physician behavior