
A method for screening the quality of hospital care using administrative data: preliminary validation results.

L I Iezzoni, S M Foley, T Heeren, J Daley, C C Duncan, E S Fisher, J Hughes.

Abstract

Applying a computerized algorithm to administrative data to help assess the quality of hospital care is intriguing. As Iezzoni and colleagues point out, there are major differences of opinion as to the worth of such efforts. This article significantly advances the state of the art in using administrative data to screen for potential quality-of-care problems. In addition, this work on identifying complications of care goes well beyond the emphasis of many government organizations on hospital mortality rates. One question, however, not raised in the paper is: What is a practical upper limit to the sensitivity and specificity in comparing computerized screen results with the consensus judgments of a group of independent physicians? Advanced statistical techniques (such as bootstrapping) might be used to estimate the stability of consensus judgments by physician groups. When the judgments of two groups of physicians are compared with each other, the resulting sensitivity and specificity will not be .99! In addition, more training of members of the physician panels would probably have increased interrater reliability. While acknowledging this problem, the researchers' detailed analysis of the panel results is intriguing and represents a model for such studies. It is hoped that the authors will follow up on the avenues opened here. Furthermore, what degree of accuracy is necessary to identify facilities with higher-than-expected rates of complications? The authors discuss problems involved in using administrative data to target hospitals and departments for more costly in-depth reviews of quality. It is hoped that the promising findings that are reported here will be validated in other studies. Certainly their algorithms should find a ready audience in insurers and hospitals willing to try them out. Finally, should we expect additional research to lead to improvement in the authors' algorithms? 
I believe the algorithms will prove difficult to improve upon; but perhaps we should not worry about this. At some point, however, the cost of trying to identify and correct quality problems in "minimally outlier" hospitals will exceed the benefits, particularly given alternative uses for the funds. Might we now be close to the "flat of the curve" in the development of such systems for identification of quality problems? This issue should be discussed much further in future studies.
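The abstract suggests bootstrapping to estimate how stable sensitivity and specificity are when a computerized screen is compared with physician consensus judgments. A minimal sketch of that idea, using a percentile bootstrap over case resamples (the data, function names, and parameters below are illustrative assumptions, not from the original study):

```python
import random

def sens_spec(screen, gold):
    """Sensitivity and specificity of a binary screen vs. a gold standard."""
    tp = sum(1 for s, g in zip(screen, gold) if s and g)
    tn = sum(1 for s, g in zip(screen, gold) if not s and not g)
    fn = sum(1 for s, g in zip(screen, gold) if not s and g)
    fp = sum(1 for s, g in zip(screen, gold) if s and not g)
    return tp / (tp + fn), tn / (tn + fp)

def bootstrap_ci(screen, gold, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence intervals for sensitivity/specificity.

    Resamples cases with replacement; resamples whose gold standard has
    no positives or no negatives are skipped (the ratios are undefined).
    """
    rng = random.Random(seed)
    n = len(gold)
    sens_samples, spec_samples = [], []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        s = [screen[i] for i in idx]
        g = [gold[i] for i in idx]
        if not any(g) or all(g):
            continue
        se, sp = sens_spec(s, g)
        sens_samples.append(se)
        spec_samples.append(sp)

    def ci(xs):
        xs = sorted(xs)
        lo = xs[int(len(xs) * alpha / 2)]
        hi = xs[int(len(xs) * (1 - alpha / 2)) - 1]
        return lo, hi

    return ci(sens_samples), ci(spec_samples)
```

Wide intervals from such a resampling exercise would make concrete the reviewer's point that a .99 agreement with physician panels is not a realistic ceiling.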

Year:  1992        PMID: 1465294     DOI: 10.1016/s0097-5990(16)30557-7

Source DB:  PubMed          Journal:  QRB Qual Rev Bull        ISSN: 0097-5990


Related articles (20 in total):

1.  Discrepancies between explicit and implicit review: physician and nurse assessments of complications and quality.

Authors:  Saul N Weingart; Roger B Davis; R Heather Palmer; Michael Cahalane; Mary Beth Hamel; Kenneth Mukamal; Russell S Phillips; Donald T Davies; Lisa I Iezzoni
Journal:  Health Serv Res       Date:  2002-04       Impact factor: 3.402

2.  Electronically screening discharge summaries for adverse medical events.

Authors:  Harvey J Murff; Alan J Forster; Josh F Peterson; Julie M Fiskio; Heather L Heiman; David W Bates
Journal:  J Am Med Inform Assoc       Date:  2003-03-28       Impact factor: 4.497

3. (Review) Detecting adverse events using information technology.

Authors:  David W Bates; R Scott Evans; Harvey Murff; Peter D Stetson; Lisa Pizziferri; George Hripcsak
Journal:  J Am Med Inform Assoc       Date:  2003 Mar-Apr       Impact factor: 4.497

4. (Review) Administrative data based patient safety research: a critical review.

Authors:  C Zhan; M R Miller
Journal:  Qual Saf Health Care       Date:  2003-12

5.  Using severity measures to predict the likelihood of death for pneumonia inpatients.

Authors:  L I Iezzoni; M Shwartz; A S Ash; Y D Mackiernan
Journal:  J Gen Intern Med       Date:  1996-01       Impact factor: 5.128

6.  [Clinical quality measurement in anaesthesia from routine data. Examples of appendectomy and resection of the colon].

Authors:  B Jüttner; K Stenger; G Heller; A Krause; C Günster; D Scheinichen
Journal:  Anaesthesist       Date:  2012-05       Impact factor: 1.041

7. (Review) [The validity of routine data on quality assurance: A qualitative systematic review].

Authors:  E Hanisch; T F Weigel; A Buia; H-P Bruch
Journal:  Chirurg       Date:  2016-01       Impact factor: 0.955

8.  Validating administrative records in post-traumatic stress disorder.

Authors:  Thad E Abrams; Mary Vaughan-Sarrazin; Terence M Keane; Kelly Richardson
Journal:  Int J Methods Psychiatr Res       Date:  2015-06-16       Impact factor: 4.035

9.  Measuring hospital inefficiency: the effects of controlling for quality and patient burden of illness.

Authors:  Ryan L Mutter; Michael D Rosko; Herbert S Wong
Journal:  Health Serv Res       Date:  2008-09-08       Impact factor: 3.402

10.  Measuring hospital quality: can medicare data substitute for all-payer data?

Authors:  Jack Needleman; Peter I Buerhaus; Soeren Mattke; Maureen Stewart; Katya Zelevinsky
Journal:  Health Serv Res       Date:  2003-12       Impact factor: 3.402

