Mohamed Estai1,2, Marc Tennant2, Dieter Gebauer2,3, Andrew Brostek4, Janardhan Vignarajan1, Maryam Mehdizadeh1, Sajib Saha1. 1. The Australian e-Health Research Centre, CSIRO, Floreat, Australia. 2. School of Human Sciences, The University of Western Australia, Crawley, Australia. 3. Department of Oral and Maxillofacial Surgery, Royal Perth Hospital, Perth, Australia. 4. The UWA Dental School, The University of Western Australia, Crawley, Australia.
Abstract
OBJECTIVE: This study aimed to evaluate an automated system for detecting and classifying permanent teeth on orthopantomogram (OPG) images using convolutional neural networks (CNNs). METHODS: In total, 591 digital OPGs were collected from patients older than 18 years. Three qualified dentists performed individual tooth labelling on the images to generate the ground truth annotations. A three-step procedure, relying upon CNNs, was proposed for automated detection and classification of teeth. Firstly, U-Net, a type of CNN, performed preliminary segmentation of tooth regions, detecting regions of interest (ROIs) on the panoramic images. Secondly, Faster R-CNN, an advanced object detection architecture, identified each tooth within the ROI determined by the U-Net. Thirdly, a VGG-16 architecture classified each tooth into one of 32 categories, and a tooth number was assigned. A total of 17,135 teeth cropped from the 591 radiographs were used to train and validate the tooth detection and tooth numbering modules. 90% of the OPG images were used for training, and the remaining 10% were used for validation; 10-fold cross-validation was performed to measure performance. The intersection over union (IoU), F1 score, precision, and recall (i.e. sensitivity) were used as metrics to evaluate the performance of the resultant CNNs. RESULTS: The ROI detection module had an IoU of 0.70. The tooth detection module achieved a recall of 0.99 and a precision of 0.99. The tooth numbering module had a recall, precision, and F1 score of 0.98 each. CONCLUSION: The resultant automated method achieved high performance for automated tooth detection and numbering from OPG images. Deep learning can be helpful in the automatic filing of dental charts in general dentistry and forensic medicine.
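The evaluation metrics reported above (IoU for the ROI detection module; precision, recall, and F1 for the detection and numbering modules) follow standard definitions. The sketch below is illustrative only, not the authors' code; it assumes boxes are given as (x1, y1, x2, y2) corner coordinates.

```python
def box_iou(a, b):
    """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
    # Corners of the intersection rectangle
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    # Clamp to zero when the boxes do not overlap
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def precision_recall_f1(tp, fp, fn):
    """Precision, recall (sensitivity), and F1 from detection counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return precision, recall, f1
```

For example, two 2x2 boxes offset by one pixel in each direction overlap in a 1x1 region, giving an IoU of 1/7; and a module with 99 true positives, 1 false positive, and 1 false negative has precision and recall of 0.99, as reported for the tooth detection module.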