T K Hazlet, T A Lee, P D Hansten, J R Horn. Department of Pharmacy, School of Pharmacy, University of Washington, Seattle, WA 98195-7630, USA. thazlet@u.washington.edu
Abstract
OBJECTIVE: To evaluate the performance of computerized drug-drug interaction (DDI) software in identifying clinically important DDIs.

DESIGN: One-time performance test of computer systems using a standard set of prescriptions.

SETTING: Community pharmacies or central corporate locations with pharmacy terminals identical to those used in actual pharmacies.

PARTICIPANTS: Chain and health maintenance organization (HMO) pharmacies with seven or more practice sites in Washington State. A total of nine different DDI software programs were installed in the 516 community pharmacies represented by these chains and HMOs.

MAIN OUTCOME MEASURES: Sensitivity, specificity, and positive and negative predictive values of the software in detecting 16 well-established DDIs contained within six fictitious patient profiles.

RESULTS: The software systems failed to detect clinically relevant DDIs one-third of the time. Sensitivity of the software programs ranged from 0.44 to 0.88, with 1.00 being perfect; specificity ranged from 0.71 to 1.00; positive predictive value ranged from 0.67 to 1.00; and negative predictive value ranged from 0.69 to 0.90. For software packages installed at more than one location, performance differed between installations.

CONCLUSION: The performance of most DDI-detecting software programs tested in this study was suboptimal. Improvement is needed if these programs are to contribute meaningfully to DDI detection.
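The four outcome measures reported above are standard functions of confusion-matrix counts (true/false positives and negatives). A minimal sketch of that calculation follows; the counts used are hypothetical, chosen only to illustrate how a sensitivity near the study's upper bound of 0.88 could arise from the 16 known DDIs, and do not come from the study's data:

```python
def classification_metrics(tp, fp, tn, fn):
    """Compute screening-performance measures from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),  # known DDIs correctly flagged
        "specificity": tn / (tn + fp),  # non-interacting pairs correctly passed
        "ppv": tp / (tp + fp),          # proportion of alerts that are real DDIs
        "npv": tn / (tn + fn),          # proportion of unflagged pairs truly non-interacting
    }

# Hypothetical example: a system flags 14 of the 16 established DDIs
# (2 missed) and correctly passes 10 non-interacting pairs with no
# false alerts. Its sensitivity is 14/16 = 0.875.
metrics = classification_metrics(tp=14, fp=0, tn=10, fn=2)
```

Note that sensitivity depends only on how the true DDIs are handled, while specificity and the predictive values also depend on how many non-interacting pairs the test set contains, which is why all four measures are reported together.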