
Efficacy of deep convolutional neural network algorithm for the identification and classification of dental implant systems, using panoramic and periapical radiographs: A pilot study.

Jae-Hong Lee, Seong-Nyum Jeong.

Abstract

Convolutional neural networks (CNNs), a particular type of deep learning architecture, are positioned to become one of the most transformative technologies for medical applications. The aim of the current study was to evaluate the efficacy of a deep CNN algorithm for the identification and classification of dental implant systems. A total of 5390 panoramic and 5380 periapical radiographic images from 3 types of dental implant systems, with similar shapes and internal conical connections, were randomly divided into a training and validation dataset (80%) and a test dataset (20%). We performed image preprocessing and transfer learning based on a fine-tuned, pre-trained deep CNN architecture (GoogLeNet Inception-v3). The test dataset was used to assess accuracy, sensitivity, specificity, the receiver operating characteristic curve, the area under the receiver operating characteristic curve (AUC), and the confusion matrix, compared between the deep CNN and a periodontal specialist. We found that the deep CNN architecture (AUC = 0.971, 95% confidence interval 0.963–0.978) and a board-certified periodontist (AUC = 0.925, 95% confidence interval 0.913–0.935) showed reliable classification accuracies. This study demonstrated that a deep CNN architecture is useful for the identification and classification of dental implant systems using panoramic and periapical radiographic images.


Year:  2020        PMID: 32590758      PMCID: PMC7328970          DOI: 10.1097/MD.0000000000020787

Source DB:  PubMed          Journal:  Medicine (Baltimore)        ISSN: 0025-7974            Impact factor:   1.817


Introduction

Dental implants are used to replace or reconstruct missing teeth. Systematic reviews and meta-analyses published in recent decades report long-term (more than 10 years) success and survival rates of over 90%. However, although dental implants have become a widespread and rapidly growing treatment option, mechanical and biological complications occur frequently, and the number of failures is steadily increasing. In a long-term systematic review, the cumulative 5-year incidence of mechanical complications was reported to be 12.7% for loosening of screws or abutments and 0.35% for screw or abutment fractures. Similarly, a large multicenter study that monitored a total of 19,087 implant cases over 9 years confirmed 70 fixture fractures (0.4%). Another systematic review of biological complications reported prevalences of peri-implant mucositis and peri-implantitis of up to 65% and 47%, respectively. Hundreds of manufacturers produce over 4000 different types of dental implant systems globally. A wide variety of fixture structures (straight, tapered, conical, ovoid, trapezoidal, internal, and external) with different surface treatment techniques (machined, blasted, acid-etched, hydroxyapatite-coated, titanium plasma-sprayed, and oxidized) are continuously being developed and clinically applied. Therefore, if clinical dental practitioners cannot identify and classify a dental implant system when mechanical or biological complications occur, the probability of invasive treatment modalities for repair or replantation increases. Although panoramic and periapical radiographs are the primary means of identifying and classifying dental implant systems, it is exceedingly difficult to distinguish different systems with similar shapes and features through radiographs, owing to significant inherent weaknesses such as noise, haziness, and distortion.
Computer-aided diagnostic systems have shown good efficiency and improved outcomes when applied in various medical and dental fields. In particular, among deep learning technologies, convolutional neural networks (CNNs) have developed rapidly in recent years and demonstrate excellent performance in image analysis tasks such as detection, classification, and segmentation. However, despite the excellent performance and reliability of deep CNN algorithms, basic research and clinical applications in the dental field remain very limited. The purpose of this study was to demonstrate the efficacy of a deep CNN algorithm for the identification and classification of dental implant systems using panoramic and periapical radiographs.

Materials and methods

Datasets

This study was conducted at the Department of Periodontology, Daejeon Dental Hospital, Wonkwang University, and all image datasets were anonymized and separated from any personal identifiers. The research protocol was approved by the Institutional Review Board of Daejeon Dental Hospital, Wonkwang University (approval no. W1809/001-001). Raw panoramic and periapical radiographic images (INFINITT PACS, Infinitt, Seoul, Korea) of patients who underwent dental implant treatment at the dental hospital were acquired between January 2010 and December 2019. Three types of dental implant systems – TSIII SA, Osstem Implant Co. Ltd., Seoul, Korea; Superline, Dentium Co. Ltd., Seoul, Korea; SLActive BLT implant, Institut Straumann AG, Basel, Switzerland – were classified and each dental implant system was labeled based on electronic dental records. These dental implant systems have a sandblasted, large-grit, acid-etched surface, and an internal conical connection with similar tapered structure in common (Fig. 1).
Figure 1

Three types of dental implant systems have a sandblasted, large-grit, acid-etched surface, and an internal connection with similar tapered morphology in common.

Osstem TSIII implant system: fixtures with a diameter of 3.5 to 5.0 mm and length of 7 to 13 mm, designed with double and corkscrew thread, helix cutting edge, and an apical taper angle of 1.5°. Dentium Superline implant system: fixtures with a diameter of 3.6 to 5.0 mm and length of 8 to 12 mm, designed with double threads and a long cutting edge. Straumann BLT implant system: fixtures with a diameter of 3.3 to 4.8 mm and length of 8 to 12 mm, designed with full-depth threads, 3 cutting notches, and an apical taper angle of 9°.

Preprocessing and image augmentation

The regions of interest, each displaying only 1 implant fixture per image, were manually cropped and labeled by 3 periodontology residents who were not directly involved in the study, using radiographic image analysis software (OsiriX 10.0, 64-bit version; Pixmeo SARL). Images with severe noise, haziness, or distortion were excluded. The remaining images were calibrated for contrast and brightness using global contrast normalization and zero-phase whitening. The average value, X̄, and the standard deviation, σ, of each image were obtained, and global contrast normalization of the image data was performed as X ← (X − X̄) / σ. The final dataset consisted of 10,770 cropped radiographic images (extracted from 5390 panoramic and 5380 periapical radiographic images), comprising 4600 Osstem TSIII implant systems (2340 [50.9%] panoramic and 2260 [49.1%] periapical images), 4370 Dentium Superline implant systems (2160 [49.4%] panoramic and 2210 [50.6%] periapical images), and 1800 Straumann BLT implant systems (890 [49.4%] panoramic and 910 [50.6%] periapical images), as shown in Figure 2. The dataset was randomly divided into 3 groups: a training dataset (n = 6462 [60%]), a validation dataset (n = 2154 [20%]), and a test dataset (n = 2154 [20%]). The training dataset was randomly augmented 10-fold (n = 64,620) using horizontal and vertical flips, rotation (range of 10°), width and height shifting (range of 0.1), and zooming (range of 0.8–1.2). The test dataset was allocated in equal proportion to each dental implant system.
Figure 2

The dataset consisted of a total of cropped 10,770 radiographic images (extracted from 5390 panoramic and 5380 periapical radiographic images), consisting of 4600 Osstem TSIII, 4370 Dentium Superline, and 1800 Straumann BLT dental implant systems.

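The global contrast normalization step described above can be sketched in a few lines of NumPy. This is a minimal illustration of the formula X ← (X − X̄) / σ; the function name and the small epsilon guard against flat (zero-variance) images are our own additions, not from the paper:

```python
import numpy as np

def global_contrast_normalize(image, eps=1e-8):
    """Rescale an image to zero mean and unit variance: X <- (X - mean) / std."""
    x = np.asarray(image, dtype=np.float64)
    return (x - x.mean()) / (x.std() + eps)  # eps avoids division by zero

# Example on a small synthetic patch standing in for a cropped fixture image
patch = np.array([[10.0, 20.0], [30.0, 40.0]])
normalized = global_contrast_normalize(patch)
print(round(normalized.mean(), 6), round(normalized.std(), 6))  # ≈ 0.0 and 1.0
```

In practice this would be applied per image before feeding the crops to the network; the 10-fold augmentation (flips, rotation, shifts, zoom) can then be layered on top of the normalized images.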

Architecture of the deep convolutional neural network

In this study, we used the representative deep CNN architecture GoogLeNet Inception-v3, which has achieved excellent performance in high-level feature abstraction. This architecture was developed by the Google research team and consists of 9 inception modules, including an auxiliary classifier, 2 fully connected layers, softmax functions, and 22 dense layers. A model pre-trained on ImageNet was used for preprocessing and transfer learning, meaning the architecture had learned comprehensive natural-image features from approximately 1.28 million images spanning 1000 object categories. Additional fine-tuning was performed by optimizing the weights, and the architecture was trained for 1000 epochs (Fig. 3).
Figure 3

Overall scheme and overview representing the GoogLeNet Inception-v3 architecture. Implementations of the pre-trained convolutional neural network model using transfer learning. The dataset for the implant fixture images was obtained by cropping the regions of interests and using as input. The final output layer performs softmax classification and provides the predictions.

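As a concrete illustration of the final softmax classification step mentioned above, the following minimal NumPy sketch converts one image's raw scores into class probabilities; the logit values are made up for illustration, while the class names are the three systems from the study:

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax: subtract the max before exponentiating."""
    z = logits - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

classes = ["Osstem TSIII", "Dentium Superline", "Straumann BLT"]
logits = np.array([2.1, 0.3, -1.2])  # hypothetical raw scores for one cropped image
probs = softmax(logits)
print(classes[int(np.argmax(probs))])  # prints "Osstem TSIII"
```

The network's output layer performs exactly this mapping from real-valued scores to a probability distribution over the 3 implant classes, and the predicted class is the one with the highest probability.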

Comparing the performance of the deep CNN architecture to human expert

A total of 2154 radiographic images (718 images for each dental implant system) were randomly selected from the test dataset using a computer-aided tool (Keras framework in Python 3.6.1; Python Software Foundation, Wilmington, DE). Then, the classification accuracy of the trained deep CNN architecture and of the board-certified periodontist (JHL) was evaluated on this test dataset.

Statistical analysis

All statistical analyses were performed using the Keras framework in Python (Python 3.6.1, Python Software Foundation, Wilmington, DE) and the MedCalc statistical package (version 12.7.0, Mariakerke, Belgium). Classification performance on the test dataset was evaluated using receiver operating characteristic curves, 95% confidence intervals (CIs), and confusion matrices.
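The reported metrics can all be derived from a confusion matrix. The sketch below (plain NumPy, with our own helper names and toy labels rather than the study's data) computes overall accuracy plus one-vs-rest sensitivity and specificity for a 3-class problem:

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    """Rows are true classes, columns are predicted classes."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm

def sensitivity_specificity(cm, k):
    """One-vs-rest sensitivity and specificity for class k."""
    tp = cm[k, k]
    fn = cm[k, :].sum() - tp
    fp = cm[:, k].sum() - tp
    tn = cm.sum() - tp - fn - fp
    return tp / (tp + fn), tn / (tn + fp)

# Toy labels: 0 = Osstem TSIII, 1 = Dentium Superline, 2 = Straumann BLT
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 1, 1, 1, 2, 0]
cm = confusion_matrix(y_true, y_pred, 3)
accuracy = np.trace(cm) / cm.sum()
print(round(accuracy, 3))  # prints 0.667
```

Per-class AUCs are computed analogously in a one-vs-rest fashion by sweeping the decision threshold over the softmax probabilities for each class.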

Results

Classification of dental implant systems

Table 1 and Figure 4 show a comparison of receiver operating characteristic curves appraising the accuracy of the deep CNN architecture and the periodontist. Using combined panoramic and periapical radiographic images, the deep CNN architecture had an AUC of 0.971 (95% CI, 0.963–0.978), while the corresponding value for the periodontist was 0.925 (95% CI, 0.913–0.935). Using only panoramic radiographic images, the deep CNN architecture had an AUC of 0.956 (95% CI, 0.942–0.967), while the corresponding value for the periodontist was 0.891 (95% CI, 0.871–0.909). Using only periapical radiographic images, the deep CNN architecture had an AUC of 0.979 (95% CI, 0.969–0.987), while the corresponding value for the periodontist was 0.959 (Table 1 and Fig. 4).
Table 1

Comparison between the deep convolutional neural networks algorithm and periodontist for the identification and classification of 3 types of dental implant systems.

Figure 4

Comparison of receiver operating characteristic curves of the deep convolutional neural network (CNN) architecture and the periodontist. (A) Dataset consisted of 2154 panoramic and periapical radiographic images, (B) Dataset consisted of 1078 panoramic radiographic images, and (C) Dataset consisted of 1076 periapical radiographic images.


Confusion matrix

We also analyzed the confusion matrix of the multiclass classification of dental implant systems by the deep CNN architecture on the test dataset (Fig. 5). The accuracy for the Straumann BLT implant system (99.4% for panoramic and 99.5% for periapical radiographic images) was the highest among the 3 types of dental implant systems.
Figure 5

Multiclass classification confusion matrix using deep convolutional neural network architecture. (A) Panoramic radiographic images without normalization, (B) Periapical radiographic images without normalization.


Discussion

The use of dental implants for the rehabilitation of partially or fully edentulous patients is growing rapidly. However, the number of dental implants that cannot be identified, because of the absence of available dental records, is also increasing. In addition, dental practitioners with relatively short clinical experience have more difficulty distinguishing between the various designs of dental implant systems because of their limited firsthand observations. In the past, efforts were made to identify dental implant systems from a forensic and medical viewpoint. However, these endeavors were conducted with very small datasets or resulted in accuracy too low for practical clinical application. Sahiwal et al attempted to recognize dental implant systems from radiographic images; however, only 20 images per dental implant were used, and as a result, a dental implant system could be accurately recognized only within 10 degrees of vertical angulation. Michelinakis et al also developed computer-aided recognition software to identify dental implant systems; however, it requires manually recording the characteristics of the dental implant (such as diameter, length, thread type, surface property, and collar shape). Before conducting this study, we compared the classification accuracy for dental implant systems of 3 major CNN architectures with and without transfer learning (VGG-19, Inception-v3, and ResNet-50) to find the optimal model.
Although all 3 algorithms produced reliable results, the pre-trained Inception-v3 architecture showed the best performance (AUC = 0.922, 95% CI 0.876–0.955), and we therefore adopted the Inception-v3 architecture in this study. The GoogLeNet Inception-v3 architecture, developed in 2014 and modified in 2016, showed excellent multiclass image classification and object detection performance in the annual ImageNet Large Scale Visual Recognition Challenge. Consequently, this architecture has been widely adopted in various medical and dental fields, for both diagnostic and therapeutic applications. In particular, it has demonstrated superior performance in detecting and classifying diabetic retinopathy in retinal fundus photographs, pulmonary tuberculosis in chest radiographs, skin cancers in skin photographs, and cystic lesions in panoramic and cone-beam computed tomography radiographs. Our goal was to learn discriminative features for contour identification and classification using this well-known and highly effective deep CNN architecture. To the best of our knowledge, this study is the first to evaluate the efficacy of a deep CNN architecture for this task using panoramic and periapical radiographic images. We demonstrated that the GoogLeNet Inception-v3 architecture provided reliable performance (AUC between 0.956 and 0.979), superior to that of the board-certified periodontist (AUC between 0.891 and 0.959). In particular, the Straumann BLT implant system had the highest accuracy in both panoramic and periapical radiographic images. This result is likely attributable to the pronounced taper of the Straumann BLT implant system and reflects one of the major limitations of the dataset collected in this study. We retained 10,770 cropped panoramic and periapical radiographic images across the 3 categories, a relatively small number for reliable training and testing of a deep CNN architecture.
To overcome this limitation and to avoid overfitting, the training dataset was randomly augmented 10-fold, and a fine-tuning strategy with transfer learning was performed manually and meticulously. Another limitation was that dental implants of the same base system still differ in structure depending on their diameter and length; our dataset did not account for these factors. As mentioned in the introduction, although a large number of dental implant systems with different designs exist, only 3 types were included in the dataset, which limits practical use. In recent years, the efficacy of deep learning has been actively investigated for 3-dimensional data sources, and a variety of deep CNN architectures have already been specialized and optimized for 3-dimensional computed tomographic images. In contrast, 2-dimensional images (including dental panoramic and periapical radiographs) are more distorted than 3-dimensional images. This is another major limitation impeding the clear identification and classification of dental implant systems. Therefore, if additional information (e.g., exact diameter and length, based on a 3-dimensional implant image from dental cone-beam computed tomography) were included in the dataset, classification accuracy could be improved.

Conclusions

Deep learning is predicted to become one of the most transformative technologies for dental applications. We found that a deep CNN architecture was useful for the identification and classification of dental implant systems using panoramic and periapical radiographic images. Further studies should concentrate on the effectiveness of deep CNN architectures with datasets of higher quality and quantity, obtained from clinical dental practice.

Acknowledgments

We would like to thank the periodontology residents (Dr Eun-Hee Jeong, Dr Bo-Ram Nam, and Dr Do-Hyung Kim) who helped prepare the dataset for this study.

Author contributions

Conceptualization: Jae-Hong Lee. Data curation: Jae-Hong Lee, Seong-Nyum Jeong. Formal analysis: Jae-Hong Lee, Seong-Nyum Jeong. Funding acquisition: Jae-Hong Lee. Investigation: Jae-Hong Lee, Seong-Nyum Jeong. Methodology: Jae-Hong Lee, Seong-Nyum Jeong. Project administration: Jae-Hong Lee, Seong-Nyum Jeong. Resources: Jae-Hong Lee, Seong-Nyum Jeong. Validation: Jae-Hong Lee, Seong-Nyum Jeong. Writing – original draft: Jae-Hong Lee, Seong-Nyum Jeong. Writing – review & editing: Jae-Hong Lee, Seong-Nyum Jeong.
