OBJECTIVES: The commonly used methods of chart review, billing data summaries, and practitioner self-reporting have not been examined for their ability to validly and reliably represent time use and service delivery in routine dental practice. A more thorough investigation of these data sources would provide insight into the appropriateness of each approach for measuring various clinical behaviors. The aim of this study was to assess the validity of commonly used methods (dental chart review, billing data, and practitioner self-report) compared with a 'gold standard' of information derived from direct observation of routine dental visits. METHODS: A team of trained dental hygienists directly observed 3751 patient visits in 120 dental practices and recorded the behaviors and procedures performed by dentists and hygienists during patient contact time. Following each visit, charts and billing records were reviewed for the procedures performed and billed. Dental providers characterized their frequency of preventive service delivery through self-administered surveys. We standardized the observation and abstraction methods to obtain optimal measures from each data source. Multi-rater kappa coefficients were computed to monitor standardization, and sensitivity, specificity, and kappa coefficients were calculated to compare each data source with direct observation. RESULTS: Chart audits were more sensitive than billing data for all observed procedures and showed higher agreement with directly observed data. Neither chart nor billing records were sensitive for several prevention-related tasks (oral cancer screening and oral hygiene instruction). Providers consistently overestimated their delivery of preventive behaviors in self-reports compared with direct observation. Inter-method reliability kappa coefficients for 13 procedures ranged from 0.197 to 0.952.
CONCLUSIONS: These concordance findings suggest that the strengths and weaknesses of each data collection source should be considered when investigating the delivery of dental services, especially when using practitioner survey data. Future investigations can rely more fully on charted information than on billing data or provider self-report for most dental procedures, but nonbillable procedures and most counseling interactions will not be captured by routine charting and billing practices.
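The METHODS section describes comparing each data source against direct observation using sensitivity, specificity, and kappa coefficients. As a minimal sketch of those agreement statistics, the following computes them for one hypothetical procedure coded as present/absent per visit; the `gold` and `chart` vectors are invented for illustration and are not the study's data:

```python
def agreement_stats(gold, test):
    """Sensitivity, specificity, and Cohen's kappa of `test` against `gold`.

    Both inputs are equal-length sequences of 0/1 codes (1 = procedure
    recorded as performed for that visit).
    """
    tp = sum(1 for g, t in zip(gold, test) if g == 1 and t == 1)
    tn = sum(1 for g, t in zip(gold, test) if g == 0 and t == 0)
    fp = sum(1 for g, t in zip(gold, test) if g == 0 and t == 1)
    fn = sum(1 for g, t in zip(gold, test) if g == 1 and t == 0)
    n = tp + tn + fp + fn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    p_observed = (tp + tn) / n  # raw proportion of visits where sources agree
    # Chance agreement from the marginal totals of each source
    p_chance = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (p_observed - p_chance) / (1 - p_chance)
    return sensitivity, specificity, kappa

# Illustrative data: direct observation (gold standard) vs chart audit, 10 visits
gold = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0]
chart = [1, 1, 1, 0, 0, 0, 0, 1, 1, 0]
sens, spec, kappa = agreement_stats(gold, chart)
# With these illustrative vectors: sensitivity 0.8, specificity 0.8, kappa 0.6
```

Kappa corrects the raw agreement for the agreement expected by chance, which is why it is the preferred inter-method reliability measure when, as here, most visits may share the same code.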
Authors: Joseph L Riley; Valeria V Gordan; D Brad Rindal; Jeffrey L Fellows; Craig T Ajmo; Craig Amundson; Gerald A Anderson; Gregg H Gilbert Journal: Community Dent Oral Epidemiol Date: 2010-05-18 Impact factor: 3.383