Gabriel García, Adrián Colomer, Valery Naranjo.
Abstract
Analysis of histopathological images is the most reliable procedure for identifying prostate cancer. Most studies develop computer-aided systems to address the Gleason grading problem. In contrast, we delve into the discrimination between healthy and cancerous tissue at its earliest stage, focusing only on the information contained in the automatically segmented gland candidates. We propose a hand-driven learning approach, in which we perform an exhaustive hand-crafted feature extraction stage, combining in a novel way descriptors of morphology, texture, fractals and contextual information of the candidates under study. We then carry out an in-depth statistical analysis to select the most relevant features, which constitute the inputs to the optimised machine-learning classifiers. Additionally, we apply, for the first time on segmented prostate glands, deep-learning algorithms based on a modified version of the popular VGG19 neural network. We fine-tuned the last convolutional block of the architecture to provide the model with specific knowledge about the gland images. The hand-driven learning approach, using a nonlinear Support Vector Machine, slightly outperforms the other experiments, with a final multi-class accuracy of 0.876 ± 0.026 in the discrimination between false glands (artefacts), benign glands and Gleason grade 3 glands.
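As a rough illustration of the hand-driven pipeline summarised in the abstract, the sketch below concatenates toy morphology and texture descriptors into a single feature vector per gland candidate. The function names and the particular descriptors are illustrative assumptions, not the paper's actual feature set.

```python
import numpy as np

def morphology_descriptors(gland_mask: np.ndarray) -> np.ndarray:
    """Toy morphological descriptors of a binary gland mask:
    area, bounding-box extent and aspect ratio (illustrative only)."""
    rows, cols = np.nonzero(gland_mask)
    area = float(gland_mask.sum())
    height = rows.max() - rows.min() + 1
    width = cols.max() - cols.min() + 1
    extent = area / (height * width)      # fill ratio of the bounding box
    aspect = width / height
    return np.array([area, extent, aspect], dtype=float)

def texture_descriptors(gray: np.ndarray) -> np.ndarray:
    """Toy first-order texture statistics: mean, standard deviation and
    the entropy of an 8-bin intensity histogram."""
    hist, _ = np.histogram(gray, bins=8, range=(0, 256))
    p = hist[hist > 0] / hist.sum()
    entropy = -np.sum(p * np.log2(p))
    return np.array([gray.mean(), gray.std(), entropy])

# Concatenate the per-family descriptors into one feature vector per
# candidate, mimicking the combination of morphology, texture, fractal and
# contextual descriptors before feature selection.
mask = np.zeros((16, 16), dtype=int)
mask[4:12, 6:10] = 1
gray = (mask * 120 + 40).astype(float)
features = np.concatenate([morphology_descriptors(mask), texture_descriptors(gray)])
```

The selected subset of such a vector would then feed the classifiers compared in the paper.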
Keywords: deep learning; feature selection; gland classification; hand-crafted feature extraction; hand-driven learning; histological image; prostate cancer
Year: 2019 PMID: 33267070 PMCID: PMC7514840 DOI: 10.3390/e21040356
Source DB: PubMed Journal: Entropy (Basel) ISSN: 1099-4300 Impact factor: 2.524
Figure 1. Samples of histopathological prostate tissues with different patterns according to the Gleason scale. (a) Grade 2 (normal); (b) Grade 3; (c) Grade 4; (d) Grade 5.
Figure 2. (a) example of a whole-slide image; (b) region of interest from which we perform the sliding-window protocol; (c) sub-image from which we address the segmentation task; (d) gland candidate achieved after applying the LCWT segmentation method.
Figure 3. Flowchart of the two approaches applied to histopathological prostate images to address the classification of gland candidates.
Figure 4. Process to obtain the binary map of each tissue component. (a) outputs after the colour space transformations; (b) labelled images achieved from the clustering stage; (c) masks obtained after the binarisation; (d) final maps of each tissue component.
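The clustering stage of Figure 4 can be sketched with a minimal 1-D k-means over pixel intensities, followed by binarisation into one mask per cluster. The quantile initialisation and the number of clusters here are illustrative assumptions, not the authors' exact settings.

```python
import numpy as np

def kmeans_1d(values: np.ndarray, k: int = 3, iters: int = 20):
    """Minimal k-means on scalar intensities; stands in for the clustering
    stage that labels each pixel as one tissue component."""
    centroids = np.quantile(values, np.linspace(0, 1, k))  # spread initial centres
    for _ in range(iters):
        labels = np.argmin(np.abs(values[:, None] - centroids[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = values[labels == j].mean()
    return labels, centroids

# Binarise: one mask per cluster, as in panels (b)-(d) of the figure.
pixels = np.array([5., 6., 120., 125., 250., 252., 4., 251.])
labels, centroids = kmeans_1d(pixels, k=3)
masks = [labels == j for j in range(3)]
```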
Figure 5. (a) segmented gland; (b) gland mask; (c) lumen mask.
Figure 6. (a) RGB image; (b) cyan channel; (c) hematoxylin stain; (d) eosin contribution.
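The channels of Figure 6 can be approximated with standard colour handling: the cyan channel is the complement of the red channel, and the hematoxylin and eosin contributions can be estimated via colour deconvolution. The stain matrix below uses the widely published Ruifrok-Johnston H&E values, which are an assumption here, not necessarily the authors' calibration.

```python
import numpy as np

# Ruifrok-Johnston H&E stain vectors in RGB optical-density space
# (assumed for illustration; the paper's exact stain matrix is unknown).
he_stains = np.array([[0.65, 0.70, 0.29],    # hematoxylin
                      [0.07, 0.99, 0.11],    # eosin
                      [0.27, 0.57, 0.78]])   # residual channel
M = he_stains / np.linalg.norm(he_stains, axis=1, keepdims=True)

def separate_stains(rgb: np.ndarray) -> np.ndarray:
    """Convert RGB to optical density and project onto the stain basis,
    yielding per-pixel hematoxylin/eosin/residual concentrations."""
    od = -np.log10(np.clip(rgb.astype(float), 1, 255) / 255.0)
    return od @ np.linalg.inv(M)

rgb = np.full((4, 4, 3), 200, dtype=np.uint8)
cyan = 255 - rgb[..., 0]                     # cyan = complement of the red channel
stains = separate_stains(rgb.reshape(-1, 3)).reshape(4, 4, 3)
```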
Figure 7. (a) original bounding box corresponding to an RGB gland candidate image; (b) cyan channel of the specific gland candidate; (c) increments corresponding to the fractional Gaussian noise; (d) 1D signal calculated from the increments of a single row; (e) PSD of the increments of all rows from the gland candidate bounding box.
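A minimal sketch of the fractal descriptor idea in Figure 7: each image row is treated as fractional Brownian motion, its first-order increments behave like fractional Gaussian noise, and the power spectral density (PSD) of those increments is estimated with a periodogram. The averaging over rows and the signal sizes are illustrative assumptions.

```python
import numpy as np

def row_increment_psd(row: np.ndarray) -> np.ndarray:
    """Increments of one row (an fGn-like 1D signal) and their periodogram
    PSD estimate; the PSD slope relates to the Hurst exponent used as a
    fractal descriptor."""
    increments = np.diff(row.astype(float))          # panel (c)-(d): fGn increments
    spectrum = np.abs(np.fft.rfft(increments)) ** 2  # periodogram PSD estimate
    return spectrum

# Average the per-row PSDs over the bounding box, as in panel (e).
rng = np.random.default_rng(0)
patch = rng.normal(size=(8, 64))                     # stand-in for the cyan channel
psd = np.mean([row_increment_psd(r) for r in patch], axis=0)
```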
Figure 8. (a) example of GLCM achieved from a certain image I using an offset of [0,1]; (b) illustration of the two offsets implemented in this paper to create each GLCM.
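A grey-level co-occurrence matrix (GLCM) as in Figure 8 counts how often grey level i co-occurs with grey level j at a fixed displacement. The sketch below builds one GLCM per offset; the [0,1] offset matches panel (a), while the second offset shown here is only an assumption, since the paper's exact pair is not stated in the caption.

```python
import numpy as np

def glcm(image: np.ndarray, offset=(0, 1), levels: int = 4) -> np.ndarray:
    """GLCM for one offset: entry (i, j) counts pixel pairs where a pixel of
    level i has a neighbour of level j at the given displacement."""
    dr, dc = offset
    h, w = image.shape
    m = np.zeros((levels, levels), dtype=int)
    for r in range(h):
        for c in range(w):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < h and 0 <= c2 < w:
                m[image[r, c], image[r2, c2]] += 1
    return m

img = np.array([[0, 0, 1],
                [1, 2, 2],
                [3, 3, 3]])
m_horizontal = glcm(img, offset=(0, 1))   # offset [0,1], as in panel (a)
m_vertical = glcm(img, offset=(1, 0))     # assumed second offset
```

Haralick statistics (contrast, homogeneity, energy, etc.) would then be computed from the normalised matrices.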
Figure 9. (a) gland candidates related to an artefact, a benign gland and a pathological gland, highlighted in black, green and red, respectively; (b) cyan channel of the gland candidates; (c–d) texture images computed from the cyan channel; (e) 10-bin histograms obtained after combining the images in (c–d).
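The texture images of Figure 9 appear to be local-binary-pattern style maps summarised by 10-bin histograms; assuming so, a basic 8-neighbour LBP can be sketched as follows. The exact operator the authors use is not recoverable from this extraction, so treat this as one common way to produce such maps.

```python
import numpy as np

def lbp_image(gray: np.ndarray) -> np.ndarray:
    """Basic 8-neighbour local binary pattern: each interior pixel receives a
    byte whose bits flag which neighbours are >= the centre pixel."""
    g = gray.astype(float)
    h, w = g.shape
    centre = g[1:-1, 1:-1]
    codes = np.zeros((h - 2, w - 2), dtype=int)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dr, dc) in enumerate(offsets):
        neighbour = g[1 + dr:h - 1 + dr, 1 + dc:w - 1 + dc]
        codes += (neighbour >= centre).astype(int) << bit
    return codes

rng = np.random.default_rng(0)
gray = rng.integers(0, 256, size=(32, 32))          # stand-in for a cyan channel
hist, _ = np.histogram(lbp_image(gray), bins=10, range=(0, 256))  # 10-bin descriptor
```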
Figure 10. (a) bar chart of the distribution of the first 20 variables, overlaid on the Gaussian curve shown in red; (b) box plot showing the discriminatory ability of the first 10 variables; (c) correlation matrix that visually shows the level of independence between pairs of variables.
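The independence check visualised in Figure 10c can be sketched as a greedy correlation filter over the feature matrix. The 0.95 threshold and the greedy ordering are illustrative assumptions, not the paper's statistical procedure.

```python
import numpy as np

def select_uncorrelated(X: np.ndarray, threshold: float = 0.95) -> list:
    """Greedy redundancy filter: keep a feature only if its absolute Pearson
    correlation with every already-kept feature stays below the threshold,
    mirroring the correlation-matrix inspection of panel (c)."""
    corr = np.abs(np.corrcoef(X, rowvar=False))
    kept = []
    for j in range(X.shape[1]):
        if all(corr[j, k] < threshold for k in kept):
            kept.append(j)
    return kept

rng = np.random.default_rng(0)
a = rng.normal(size=200)
b = 2.0 * a + 1e-3 * rng.normal(size=200)   # nearly a duplicate of a
c = rng.normal(size=200)                    # independent feature
X = np.column_stack([a, b, c])
kept = select_uncorrelated(X)               # the redundant copy is dropped
```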
Features selected after applying the statistical analysis.
Figure 11. (a) original input space; (b) rotation of the plane of data; (c) 3D space transformation in which the data are linearly separable; (d) representation of the classification boundary on the 2D plane.
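The transformation of Figure 11 can be made concrete with the explicit quadratic feature map behind the polynomial kernel: points separated by a circle in 2D become linearly separable in the lifted 3D space. The toy points below are assumptions for illustration; an SVM would learn the separating plane rather than having it written down.

```python
import numpy as np

def lift(x: np.ndarray) -> np.ndarray:
    """Explicit quadratic feature map phi(x1, x2) = (x1^2, sqrt(2)*x1*x2, x2^2),
    the 3D transformation illustrated by panels (a)-(c)."""
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([x1 ** 2, np.sqrt(2) * x1 * x2, x2 ** 2])

# Inner class: points with radius < 1; outer class: radius > 2.
inner = np.array([[0.2, 0.1], [-0.3, 0.4], [0.1, -0.5]])
outer = np.array([[2.5, 0.0], [0.0, -2.2], [1.8, 1.8]])
z_in, z_out = lift(inner), lift(outer)

# In the lifted space, the plane "first coordinate + third coordinate = 2"
# (i.e. x1^2 + x2^2 = 2) separates the two classes perfectly.
score_in = z_in[:, 0] + z_in[:, 2]
score_out = z_out[:, 0] + z_out[:, 2]
```

Projecting that plane back to 2D gives the circular boundary of panel (d).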
Figure 12. (a) 3D objective function showing how the model finds the optimal minimum by modifying the hyperparameters C and γ; (b) diagram showing how the minimum objective is reached as the number of iterations increases.
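The monotone "minimum objective vs. iterations" curve of Figure 12b can be reproduced with any hyperparameter search that tracks its best value so far. The sketch below substitutes plain random search and a toy quadratic error surface for the actual model-based optimisation over the SVM hyperparameters, so both the objective and the search ranges are assumptions.

```python
import numpy as np

def objective(log_c: float, log_gamma: float) -> float:
    """Toy validation-error surface with a single optimum, standing in for
    the real cross-validated error over the SVM hyperparameters."""
    return (log_c - 1.0) ** 2 + (log_gamma + 2.0) ** 2

rng = np.random.default_rng(0)
best, best_so_far = np.inf, []
for _ in range(50):                          # sample log10-scaled hyperparameters
    lc, lg = rng.uniform(-3, 3), rng.uniform(-5, 1)
    best = min(best, objective(lc, lg))
    best_so_far.append(best)                 # monotone curve, as in panel (b)
```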
Figure 13. Illustrative example showing the forward-backward propagation algorithm for an MLP architecture composed of one hidden layer with fifteen neurons and an output layer with three classes.
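The forward-backward scheme of Figure 13 can be written out in a few lines of numpy. The layer sizes follow the caption (15 hidden neurons, 3 classes), while the toy data, learning rate and iteration count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 6, 15, 3            # 15 hidden neurons, 3 output classes
W1 = rng.normal(0.0, 0.5, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.5, (n_hidden, n_out)); b2 = np.zeros(n_out)

X = rng.normal(size=(30, n_in))             # toy gland-candidate feature vectors
Y = np.eye(n_out)[rng.integers(0, n_out, size=30)]  # one-hot toy targets

losses = []
for _ in range(300):
    # --- forward propagation ---
    h = np.tanh(X @ W1 + b1)
    logits = h @ W2 + b2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)       # softmax class probabilities
    losses.append(-np.mean(np.sum(Y * np.log(p), axis=1)))
    # --- backward propagation (softmax + cross-entropy gradient) ---
    d_logits = (p - Y) / len(X)
    dW2, db2 = h.T @ d_logits, d_logits.sum(axis=0)
    d_h = (d_logits @ W2.T) * (1.0 - h ** 2)    # tanh derivative
    dW1, db1 = X.T @ d_h, d_h.sum(axis=0)
    for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        param -= 0.1 * grad                 # plain gradient-descent update
```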
Figure 14. Network architecture used to construct predictive models from gland candidates of histopathological prostate images.
Classification results per gland candidate. (PPV–Positive Predictive Value; NPV–Negative Predictive Value; AUC–Area Under the Receiver Operating Characteristic (ROC) Curve).
| | SVM | MLP | VGG19 | SVM | MLP | VGG19 |
|---|---|---|---|---|---|---|

*[per-metric values lost in extraction]*
Figure 15. ROC curves achieved from the different hand-driven and deep-learning classifiers after evaluating the artefacts vs. glands problem and the benign vs. pathological glands classification.
Accuracies comparison with other state-of-the-art studies performed at the gland level.
| | Xia et al. | Nguyen et al. | Proposed Model |
|---|---|---|---|

*[table values lost in extraction]*
Average p-values achieved after calculating the level of independence between the probability of each class and the targets, for each classifier.
| | Artefact | Benign Gland | Pathological Gland |
|---|---|---|---|

*[table values lost in extraction]*
Figure 16. Automatic prediction of labels for each gland candidate, showing in green and red the labels corresponding to benign and pathological glands, respectively. (a–c) 1024 × 1024 samples characterised by a fully benign pattern, according to the diagnosis of an expert in pathological anatomy; (d–f) samples corresponding to a fully pathological pattern; (g–i) samples diagnosed with a combined benign and pathological pattern.
Average time and standard deviation (in seconds) that each process requires to determine the benign or pathological pattern of the analysed patches.
| | Healthy Tissues (s) | Cancerous Tissues (s) |
|---|---|---|
| Clustering stage | | |
| Gland segmentation | | |
| Feature extraction | | |
| Prediction | | |

*[timing values lost in extraction]*