Kiyomi Kohinata1, Tomoya Kitano2, Wataru Nishiyama2, Mizuho Mori2, Yukihiro Iida2, Hiroshi Fujita3, Akitoshi Katsumata2. 1. Department of Oral Radiology, Asahi University School of Dentistry, Mizuho, Gifu, Japan. kohinata@dent.asahi-u.ac.jp. 2. Department of Oral Radiology, Asahi University School of Dentistry, Mizuho, Gifu, Japan. 3. Department of Electrical, Electronic and Computer Engineering, Faculty of Engineering, Gifu University, Gifu, Japan.
Abstract
OBJECTIVE: This study explored the feasibility of using deep learning to profile panoramic radiographs. STUDY DESIGN: Panoramic radiographs from 1000 patients were used. Patients were categorized by seven dental or physical characteristics: age, gender, mixed or permanent dentition, number of teeth present, impacted wisdom tooth status, implant status, and prosthetic treatment status. The Neural Network Console deep learning system (Sony Network Communications Inc., Tokyo, Japan) and the VGG-Net deep convolutional neural network were used for classification. RESULTS: Dentition and prosthetic treatment status were classified with accuracies of 93.5% and 90.5%, respectively. Tooth number and implant status were both classified with 89.5% accuracy, and impacted wisdom tooth status with 69.0% accuracy. Age and gender were classified with accuracies of 56.0% and 75.5%, respectively. CONCLUSION: The proposed profiling method may be useful for preliminary interpretation of panoramic images and as a preprocessing step before applying additional artificial intelligence techniques.
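The per-characteristic classification accuracies reported in the RESULTS section are simply the fraction of held-out test images whose predicted category matches the ground-truth label. As a minimal, illustrative sketch (the function name, labels, and mock data below are hypothetical, not from the study):

```python
def accuracy(predicted, actual):
    """Fraction of test labels predicted correctly (0.0 to 1.0)."""
    assert len(predicted) == len(actual), "label lists must align"
    correct = sum(p == a for p, a in zip(predicted, actual))
    return correct / len(actual)

# Illustrative two-class "dentition" task (mixed vs. permanent) on a
# small mock test set; real labels would come from radiograph metadata.
actual    = ["permanent", "permanent", "mixed", "permanent", "mixed", "mixed"]
predicted = ["permanent", "permanent", "mixed", "mixed",     "mixed", "mixed"]

print(f"dentition accuracy: {accuracy(predicted, actual):.1%}")  # 5 of 6 correct
```

In the study itself, such labels would be produced by the VGG-Net classifier trained in the Neural Network Console; the same accuracy measure applies to each of the seven characteristics independently.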