Nayana Damiani Macedo, Aline Rodrigues Buzin, Isabela Bastos de Araujo, Breno Valentim Nogueira, Tadeu Uggere Andrade, Denise Coutinho Endringer, Dominik Lenz.
Abstract
BACKGROUND: Manual analysis of tissue sections, such as for pathological diagnosis, requires an analyst with substantial knowledge and experience. Reproducible image analysis of biological samples is steadily gaining scientific importance. The aim of the present study was to employ image analysis followed by machine learning to identify vascular endothelial growth factor (VEGF) in kidney tissue that had been subjected to hypoxia.
Year: 2019 PMID: 31016206 PMCID: PMC6444260 DOI: 10.1155/2019/7232781
Source DB: PubMed Journal: J Immunol Res ISSN: 2314-7156 Impact factor: 4.818
Pipeline programmed for image analysis with CellProfiler (CP).
| Module | Operation |
|---|---|
| (1) LoadImages | Identify and load images in .tiff |
| (2) ColorToGray | Conversion method: split |
| (3) Morph | Operation: invert |
| (4) IdentifyPrimaryObjects | (a) Identify an object of interest: nucleus |
| (5) IdentifySecondaryObjects | (a) Object name: cell |
| (6) IdentifyTertiaryObjects | Object name: cytoplasm |
| (7) MeasureObjectSizeShape | Measurement object: cytoplasm |
| (8) MeasureObjectIntensity | Measurement object: cytoplasm |
(1) Load user-defined images. (2) Convert the original images to grayscale images. (3) Invert intensities to have bright nuclei. (4) Identify the primary object of interest (in this case, the nucleus). (5) Identify the secondary object (in this case, the entire cell). (6) Create the tertiary object (cytoplasm) by subtracting the primary object from the secondary object, i.e., subtracting the nucleus from the cell. (7) Analyze morphologic parameters in the object called cytoplasm. (8) Calculate intensity parameters in the object called cytoplasm.
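The eight pipeline steps above can be sketched in code. The following is a minimal illustration using NumPy and SciPy rather than CellProfiler's actual modules; the threshold value and the cell-expansion distance are illustrative assumptions, not parameters from the study:

```python
import numpy as np
from scipy import ndimage as ndi

def segment_and_measure(gray, nucleus_thresh=0.5, expand_px=5):
    """Sketch of pipeline steps (3)-(8) on a grayscale image in [0, 1].

    Thresholds and the expansion distance are illustrative assumptions.
    """
    # (3) Invert intensities so dark (hematoxylin-stained) nuclei become bright
    inv = 1.0 - gray
    # (4) Primary objects: threshold the inverted image and label the nuclei
    nuclei, n = ndi.label(inv > nucleus_thresh)
    # (5) Secondary objects: dilate each nucleus to approximate the whole cell
    cells = ndi.grey_dilation(nuclei, size=(2 * expand_px + 1,) * 2)
    # (6) Tertiary objects: cytoplasm = cell minus nucleus
    cytoplasm = np.where(nuclei > 0, 0, cells)
    # (7) Size per cytoplasm object; (8) mean intensity per cytoplasm object
    idx = np.arange(1, n + 1)
    areas = ndi.sum(cytoplasm > 0, cytoplasm, index=idx)
    intensities = ndi.mean(gray, cytoplasm, index=idx)
    return areas, intensities
```

In practice, step (1) would load the .tiff with an image reader and step (2) would convert RGB to grayscale before calling this function.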
Figure 1 Identification of the objects: the pipeline of CellProfiler. (a) Original image. (b) Image (a) converted to grayscale. (c) Image (b) with inverted intensities. (d) Identified nuclei. (e) Identified nuclei. Different colors indicate different objects. (f) Identified cells. Different colors indicate different objects. (g) Identified cytoplasm (i.e., cellular area minus nuclear area). Different colors indicate different objects.
Figure 2 Interface of CellProfiler Analyst. Objects on the left side of the training set were classified as VEGF−, and objects on the right side were considered VEGF+.
Figure 3 Progress of machine learning.
Figure 4 “Score image” classifier of CellProfiler Analyst. (a) Original image stained with Harris hematoxylin and DAB. Orange indicates VEGF+ cells. (b) Classification of the “score image” tool. Objects with blue points are classified as VEGF+, and objects with orange points are identified as VEGF−.
Results of the machine learning-based classification in terms of sensitivity and specificity.
| Sample ID | Sensitivity | Specificity | % VEGF+ | % VEGF− |
|---|---|---|---|---|
| 1 | 0.86 | 0.92 | 69% | 31% |
| 2 | 0.95 | 0.88 | 79% | 21% |
| 3 | 0.98 | 0.89 | 75% | 25% |
| 4 | 1.00 | 0.84 | 83% | 17% |
| 5 | 0.97 | 0.87 | 59% | 41% |
| 6 | 0.99 | 0.91 | 83% | 17% |
| 7 | 1.00 | 0.81 | 73% | 27% |
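The per-sample values in the table follow the standard confusion-matrix definitions of sensitivity and specificity. A minimal sketch; the counts below are hypothetical, chosen only to reproduce the format of the first row, and are not taken from the study:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    # Sensitivity (true-positive rate)  = TP / (TP + FN)
    # Specificity (true-negative rate)  = TN / (TN + FP)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts, for illustration only (not from the study)
sens, spec = sensitivity_specificity(tp=86, fn=14, tn=92, fp=8)
```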
Figure 5 Bland-Altman analysis comparing the manual and automated counts.
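A Bland-Altman analysis compares two measurement methods by examining the differences between paired counts, reporting the bias (mean difference) and the 95% limits of agreement. A minimal sketch, assuming approximately normally distributed differences:

```python
import numpy as np

def bland_altman(manual, automated):
    """Return bias and 95% limits of agreement between two paired counts."""
    diff = np.asarray(automated, dtype=float) - np.asarray(manual, dtype=float)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)  # 1.96 x sample standard deviation
    return bias, bias - half_width, bias + half_width
```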
Figure 6 ROC curve to evaluate the machine learning process.
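The area under an ROC curve can be computed directly from per-object classifier scores via the rank-sum (Mann-Whitney U) identity: the AUC equals the fraction of positive/negative pairs the classifier ranks correctly. A sketch with made-up labels and scores, not data from the study:

```python
def roc_auc(labels, scores):
    """AUC as the fraction of (positive, negative) pairs ranked correctly,
    with ties counting half, per the Mann-Whitney U statistic."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```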