| Literature DB >> 29572387 |
Miao Wu, Chuanbo Yan, Huiqiang Liu, Qian Liu.
Abstract
Ovarian cancer is one of the most common gynecologic malignancies. Accurate classification of ovarian cancer types (serous carcinoma, mucinous carcinoma, endometrioid carcinoma, clear cell carcinoma) is an essential part of the differential diagnosis. Computer-aided diagnosis (CADx) can provide useful advice to help pathologists reach a correct diagnosis. In our study, we employed a Deep Convolutional Neural Network (DCNN) based on AlexNet to automatically classify the different types of ovarian cancer from cytological images. The DCNN consists of five convolutional layers, three max-pooling layers, and two fully connected layers. We then trained the model on two groups of input data separately: one was the original image data and the other was augmented image data produced by image enhancement and image rotation. The testing results, obtained by 10-fold cross-validation, show that the classification accuracy improved from 72.76% to 78.20% when augmented images were used as training data. The developed scheme is useful for classifying ovarian cancers from cytological images.
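The architecture described (five convolutional layers, three max-pooling layers, two fully connected layers) follows AlexNet. As a rough sketch of how spatial dimensions shrink through such a stack, assuming standard AlexNet kernel/stride/padding settings, which the abstract does not specify:

```python
def out_size(n, k, s=1, p=0):
    """Spatial output size of a conv/pool layer: floor((n - k + 2p) / s) + 1."""
    return (n - k + 2 * p) // s + 1

# Hypothetical layer hyperparameters (kernel, stride, padding) taken from
# standard AlexNet; the paper does not list its exact settings.
layers = [
    ("conv1", 11, 4, 0),
    ("pool1", 3, 2, 0),
    ("conv2", 5, 1, 2),
    ("pool2", 3, 2, 0),
    ("conv3", 3, 1, 1),
    ("conv4", 3, 1, 1),
    ("conv5", 3, 1, 1),
    ("pool3", 3, 2, 0),
]

n = 227  # typical AlexNet input resolution (also an assumption)
sizes = []
for name, k, s, p in layers:
    n = out_size(n, k, s, p)
    sizes.append(n)
    print(f"{name}: {n}x{n}")
```

Under these assumptions the final feature map is 6×6, which the two fully connected layers then map to the four class scores.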
Keywords: Classification; Cytological Images; Deep Convolutional Neural Networks; Ovarian Cancer Types
Year: 2018 PMID: 29572387 PMCID: PMC5938423 DOI: 10.1042/BSR20180289
Source DB: PubMed Journal: Biosci Rep ISSN: 0144-8463 Impact factor: 3.840
Figure 1: Cytological image preprocessing for automatic classification of ovarian cancer by DCNN
Figure 2: Image enhancement
Figure 3: Image rotation
Figure 4: The architecture and illustration of the DCNN for ovarian cancer image classification
The number of images in each dataset for 10-fold cross-validation (‘O’ stands for original images and ‘A’ stands for augmented images)
| | Serous | | Mucinous | | Endometrioid | | Clear cell | |
|---|---|---|---|---|---|---|---|---|
| | O | A | O | A | O | A | O | A |
| Dataset1 | 42 | 462 | 48 | 528 | 42 | 462 | 41 | 451 |
| Dataset2 | 41 | 451 | 50 | 550 | 45 | 495 | 40 | 440 |
| Dataset3 | 54 | 594 | 41 | 451 | 47 | 517 | 40 | 440 |
| Dataset4 | 52 | 572 | 40 | 440 | 54 | 594 | 52 | 572 |
| Dataset5 | 51 | 561 | 44 | 484 | 46 | 506 | 46 | 506 |
| Dataset6 | 52 | 572 | 50 | 550 | 50 | 550 | 42 | 462 |
| Dataset7 | 47 | 517 | 46 | 506 | 47 | 517 | 40 | 440 |
| Dataset8 | 46 | 506 | 41 | 451 | 53 | 583 | 40 | 440 |
| Dataset9 | 48 | 528 | 51 | 561 | 53 | 583 | 45 | 495 |
| Dataset10 | 48 | 528 | 42 | 462 | 47 | 517 | 44 | 484 |
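In every cell of the table above, the augmented count is exactly 11× the original count, i.e. each original image contributes 11 training images after enhancement and rotation. A quick consistency check over the table data:

```python
# (original, augmented) pairs copied from the table above,
# one tuple per dataset/class cell.
counts = [
    (42, 462), (48, 528), (42, 462), (41, 451),   # Dataset1
    (41, 451), (50, 550), (45, 495), (40, 440),   # Dataset2
    (54, 594), (41, 451), (47, 517), (40, 440),   # Dataset3
    (52, 572), (40, 440), (54, 594), (52, 572),   # Dataset4
    (51, 561), (44, 484), (46, 506), (46, 506),   # Dataset5
    (52, 572), (50, 550), (50, 550), (42, 462),   # Dataset6
    (47, 517), (46, 506), (47, 517), (40, 440),   # Dataset7
    (46, 506), (41, 451), (53, 583), (40, 440),   # Dataset8
    (48, 528), (51, 561), (53, 583), (45, 495),   # Dataset9
    (48, 528), (42, 462), (47, 517), (44, 484),   # Dataset10
]

# Collect the augmentation factor observed in each cell.
factors = {a // o for o, a in counts if a % o == 0}
print(factors)  # → {11}
```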
The classification accuracies for two models
| | Original | Augmented |
|---|---|---|
| Serous | 82.33% | 84.14% |
| Mucinous | 71.62% | 77.51% |
| Endometrioid | 64.53% | 72.93% |
| Clear cell | 72.57% | 78.21% |
| Total | 72.76% | 78.20% |
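The accuracies above were obtained via 10-fold cross-validation: the images are partitioned into 10 disjoint subsets, each serving once as the test set while the other nine are used for training. A minimal fold-splitting sketch (a hypothetical helper, not the authors' code):

```python
def kfold_splits(items, k=10):
    """Yield (train, test) partitions for k-fold cross-validation."""
    folds = [items[i::k] for i in range(k)]  # round-robin fold assignment
    for i in range(k):
        test = folds[i]
        train = [x for j, fold in enumerate(folds) if j != i for x in fold]
        yield train, test

images = list(range(200))  # stand-in for 200 image identifiers
splits = list(kfold_splits(images, k=10))
```

Each of the 10 splits tests on a distinct tenth of the data, so every image is used for testing exactly once across the whole procedure.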
Confusion matrix of classification results generated by the DCNN model trained and tested with augmented data (rows: true class; columns: predicted class)
| | Serous | Mucinous | Endometrioid | Clear cell |
|---|---|---|---|---|
| Serous | 84.14% (4452) | 2.34% (124) | 6.46% (342) | 7.06% (374) |
| Mucinous | 4.21% (210) | 77.51% (3862) | 5.64% (281) | 12.64% (630) |
| Endometrioid | 15.11% (804) | 9.70% (516) | 72.93% (3883) | 2.26% (120) |
| Clear cell | 3.76% (178) | 11.39% (539) | 6.64% (314) | 78.21% (3699) |
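Per-class accuracy in the matrix above is the diagonal count divided by the row total, and the overall 78.20% is the sum of the diagonal counts over all test images; recomputing from the raw counts reproduces the reported figures to within rounding:

```python
# Raw counts copied from the confusion matrix above
# (rows: true class, columns: predicted class).
classes = ["Serous", "Mucinous", "Endometrioid", "Clear cell"]
cm = [
    [4452,  124,  342,  374],
    [ 210, 3862,  281,  630],
    [ 804,  516, 3883,  120],
    [ 178,  539,  314, 3699],
]

# Per-class accuracy: diagonal / row sum.
per_class = [cm[i][i] / sum(cm[i]) for i in range(4)]
# Overall accuracy: total diagonal / total count.
overall = sum(cm[i][i] for i in range(4)) / sum(map(sum, cm))

for name, acc in zip(classes, per_class):
    print(f"{name}: {acc:.2%}")
print(f"Overall: {overall:.2%}")
```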
Figure 5: Misclassified ovarian cancer images by DCNN