Ke Zeng, Yingqi Hua, Jing Xu, Tao Zhang, Zhuoying Wang, Yafei Jiang, Jing Han, Mengkai Yang, Jiakang Shen, Zhengdong Cai.
Abstract
Knee osteoarthritis (OA) is one of the most common musculoskeletal disorders. OA is currently diagnosed by assessing symptoms and evaluating plain radiographs, a process that suffers from the subjectivity of individual doctors. In this study, we retrospectively compared five commonly used machine learning methods, in particular a CNN, for predicting the Kellgren-Lawrence (K-L) grade of knee OA from real-world knee X-ray images collected at two different hospitals, to help doctors choose suitable auxiliary tools. Furthermore, we present attention maps of the CNN that highlight the radiological features influencing the network's decision. Such information makes the decision process transparent to practitioners, which builds trust in such automated methods and, moreover, reduces the workload of clinicians, especially in remote areas without enough medical staff.
Year: 2021 PMID: 34900177 PMCID: PMC8664510 DOI: 10.1155/2021/1765404
Source DB: PubMed Journal: J Healthc Eng ISSN: 2040-2295 Impact factor: 2.682
Figure 1. Sample graphs of ROI.
Description and radiographic features of the Kellgren-Lawrence (K-L) grades.

| Kellgren-Lawrence (K-L) grading scale | | | | | |
|---|---|---|---|---|---|
| Classification | Grade 0: normal | Grade 1: doubtful | Grade 2: mild | Grade 3: moderate | Grade 4: severe |
| Description | No signs of OA | Mild osteophyte; normal joint space | Definite osteophyte; normal joint space | Moderate joint space reduction | Joint space greatly reduced; subchondral sclerosis |
| X-ray image | (image) | (image) | (image) | (image) | (image) |
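For reference, the grading scale above can be encoded as a simple lookup table; the dictionary below is a hypothetical illustration, with descriptions copied from the table:

```python
# K-L grade -> clinical description, as given in the grading-scale table
KL_GRADE = {
    0: "normal - no signs of OA",
    1: "doubtful - mild osteophyte, normal joint space",
    2: "mild - definite osteophyte, normal joint space",
    3: "moderate - moderate joint space reduction",
    4: "severe - joint space greatly reduced, subchondral sclerosis",
}

print(KL_GRADE[2])  # → mild - definite osteophyte, normal joint space
```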
The five algorithms used in this study.

| Algorithm | Introduction |
|---|---|
| Support vector machine (SVM) | A class of generalized linear classifiers that performs binary classification in supervised learning; its decision boundary is the maximum-margin hyperplane fitted to the training samples |
| Naïve Bayes (NB) | The naïve Bayes classifier (NBC) is based on Bayes' theorem and assumes that features are conditionally independent of each other. Given a training set, and taking feature independence as the premise, it learns the joint probability distribution from input to output and then, using the learned model, predicts the most probable output for a given input |
| k-nearest neighbors (KNN) | KNN's principle is that, when predicting a new sample, it finds the k training samples closest to it and assigns the class that is most common among those neighbors (majority vote) |
| Radial basis function neural network (RBF) | A feedforward network with a single hidden layer whose nodes use radial basis functions (typically Gaussian) as activation functions; the output is a linear combination of the hidden-node responses |
| Convolutional neural networks (CNNs) | A convolutional neural network (CNN) is a type of feedforward neural network that contains convolutional computations and has a deep structure, and is one of the representative algorithms of deep learning. CNNs have representation-learning ability and can classify input information in a shift-invariant manner, so they are also called "shift-invariant artificial neural networks (SIANNs)" |
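The KNN principle described in the table above can be sketched in a few lines of pure Python; the toy 2-D feature vectors below are hypothetical stand-ins for features extracted from knee radiographs:

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.

    `train` is a list of (feature_vector, label) pairs.
    """
    # Sort training points by Euclidean distance to the query
    nearest = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    # Majority vote among the k closest neighbors
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical 2-D features standing in for extracted radiographic features
train = [((0.1, 0.2), "Grade 0"), ((0.2, 0.1), "Grade 0"),
         ((0.9, 0.8), "Grade 4"), ((0.8, 0.9), "Grade 4")]

print(knn_predict(train, (0.15, 0.15)))  # → Grade 0
```

With k=3, a query near the cluster of Grade 0 points picks up two "Grade 0" votes against one "Grade 4", so the majority vote assigns Grade 0.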
Figure 2. Flow diagram of the overall design.
Figure 3. RBF network model.
Figure 4. The SVM network's separating hyperplanes.
Figure 5. Foreground extraction of the knee joint.
Figure 6. Sample graphs of the heat map.
Figure 7. The ROC curve of the KNN classifier.
Figure 8. The results of the NB classifier.
Figure 9. The results of the SVM classifier.
Figure 10. The results of the RBF classifier.
Figure 11. The results of the CNN classifier.