| Literature DB >> 30565841 |
Anindya Gupta1, Philip J Harrison2, Håkan Wieslander1, Nicolas Pielawski1, Kimmo Kartasalo3,4, Gabriele Partel1, Leslie Solorzano1, Amit Suveer1, Anna H Klemm1,5, Ola Spjuth2, Ida-Maria Sintorn1, Carolina Wählby1,5.
Abstract
Artificial intelligence, deep convolutional neural networks, and deep learning are all niche terms that are increasingly appearing in scientific presentations as well as in the general media. In this review, we focus on deep learning and how it is applied to microscopy image data of cells and tissue samples. Starting with an analogy to neuroscience, we aim to give the reader an overview of the key concepts of neural networks, and an understanding of how deep learning differs from more classical approaches for extracting information from image data. We aim to increase the understanding of these methods, while highlighting considerations regarding input data requirements, computational resources, challenges, and limitations. We do not provide a full manual for applying these methods to your own data, but rather review previously published articles on deep learning in image cytometry, and guide the readers toward further reading on specific networks and methods, including new methods not yet applied to cytometry data.Entities:
Keywords: biomedical image analysis; cell analysis; convolutional neural networks; deep learning; image cytometry; machine learning; microscopy
Year: 2018 PMID: 30565841 PMCID: PMC6590257 DOI: 10.1002/cyto.a.23701
Source DB: PubMed Journal: Cytometry A ISSN: 1552-4922 Impact factor: 4.355
Figure 1. Overview of conventional versus deep learning workflows. The human in the center provides input in the form of, for example, parameter tuning and feature engineering in each step of the conventional workflow (black dashed arrows) using annotated data. Conversely, the deep learning workflow requires only annotated data to optimize features automatically. Annotated data is a key component of supervised deep learning, as illustrated in the example classification workflow. Other example tasks, as discussed in the text, follow a similar pattern. The example image was provided by the Broad Bioimage Benchmark Collection.
Figure 2. An example of a 7×7 input image I convolved with a 3×3 filter k with weights of zeros and ones to encode a representation (feature map). The receptive field is highlighted in pink and the corresponding output value for the position is marked in green. [Color figure can be viewed at wileyonlinelibrary.com]
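The convolution in Figure 2 can be sketched in a few lines of numpy: the filter slides over the image, and each output value is the sum of the element-wise product between the filter and the receptive field it covers. The image values and filter weights below are illustrative placeholders, not the ones in the figure.

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Convolve a 2D image with a 2D kernel ('valid' padding, stride 1)."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            # The receptive field is the patch of the image currently
            # covered by the kernel.
            receptive_field = image[r:r + kh, c:c + kw]
            out[r, c] = np.sum(receptive_field * kernel)
    return out

# A 7x7 binary image and a 3x3 filter of zeros and ones (placeholder
# values, loosely mirroring the setup in the figure).
image = np.array([
    [0, 0, 0, 1, 1, 0, 0],
    [0, 0, 1, 1, 1, 1, 0],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 1, 0, 0, 0, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 1, 1, 1, 1, 0],
    [0, 0, 0, 1, 1, 0, 0],
])
kernel = np.array([
    [1, 0, 1],
    [0, 1, 0],
    [1, 0, 1],
])
feature_map = conv2d_valid(image, kernel)
print(feature_map.shape)  # (5, 5): a 7x7 input and a 3x3 filter give a 5x5 map
```

Note that without padding the feature map shrinks by `kernel_size - 1` in each dimension; in practice, networks often pad the input to preserve the spatial size.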
Figure 3. The sub‐sampled output of a max‐pooling operation with a stride of 2 applied to an input image (I). [Color figure can be viewed at wileyonlinelibrary.com]
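The max-pooling operation in Figure 3 reduces each non-overlapping window to its maximum value, halving the spatial resolution when the stride is 2. A minimal numpy sketch (the input values are illustrative, not those in the figure):

```python
import numpy as np

def max_pool(image, size=2, stride=2):
    """Sub-sample a 2D image by taking the maximum over each window."""
    ih, iw = image.shape
    oh = (ih - size) // stride + 1
    ow = (iw - size) // stride + 1
    out = np.zeros((oh, ow))
    for r in range(oh):
        for c in range(ow):
            window = image[r * stride:r * stride + size,
                           c * stride:c * stride + size]
            out[r, c] = window.max()
    return out

pool_in = np.array([
    [1, 3, 2, 4],
    [5, 6, 7, 8],
    [3, 2, 1, 0],
    [1, 2, 3, 4],
])
print(max_pool(pool_in))
# Each 2x2 window is reduced to its maximum:
# [[6. 8.]
#  [3. 4.]]
```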
Figure 4. Learning process of a DNN. (a) A dense layer with an input layer, where all the encoded representations from the previous layers are fully connected to the next layers. (b) Zoomed‐in view of an example neuron showing the forward propagation to compute the output ȳ, where the non‐negative activations are defined using the ReLU. (c) Gradient‐descent based optimization of the loss function in a forward/backward propagation. [Color figure can be viewed at wileyonlinelibrary.com]
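The forward and backward passes sketched in Figure 4 can be shown for a single neuron: a weighted sum plus bias, a ReLU activation, and one gradient-descent step on a squared-error loss. All values (inputs, weights, target, learning rate) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

# Forward propagation for one neuron (Figure 4b, in spirit):
# weighted sum of inputs plus a bias, then the ReLU activation.
x = np.array([0.5, -1.0, 2.0])   # inputs from the previous layer (placeholder)
w = np.array([0.4, 0.1, 0.3])    # weights, the learned parameters (placeholder)
b = 0.05                         # bias (placeholder)
z = np.dot(w, x) + b
y_hat = relu(z)                  # z = 0.75, so y_hat = 0.75

# One gradient-descent step on a squared-error loss L = (y_hat - y)^2,
# using the chain rule (backward propagation).
y = 1.0                          # target label (placeholder)
lr = 0.1                         # learning rate (placeholder)
dL_dyhat = 2.0 * (y_hat - y)     # dL/dy_hat
dyhat_dz = 1.0 if z > 0 else 0.0  # ReLU gradient: 1 where z > 0, else 0
grad_w = dL_dyhat * dyhat_dz * x  # dL/dw
w = w - lr * grad_w              # step against the gradient
print(w)
```

Repeating this forward/backward cycle over many annotated examples is what "training" means in the deep learning workflow of Figure 1.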
Figure 5. An example of a skip connection, connecting the input with the output of one convolution block (consisting of a convolutional layer, a batch normalization layer, and a ReLU activation function).
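The block in Figure 5 can be sketched in numpy: convolution with 'same' padding (so the output keeps the input's spatial size), a batch-normalization step, a ReLU, and then the element-wise addition of the input, which is the skip connection. This is a single-channel illustration under simplifying assumptions (batch normalization is computed over the one feature map, and gamma/beta are fixed), not the paper's implementation.

```python
import numpy as np

def conv_block_with_skip(x, kernel, gamma=1.0, beta=0.0, eps=1e-5):
    """Conv -> batch norm -> ReLU, with the input added back at the end."""
    kh, kw = kernel.shape
    pad = kh // 2
    xp = np.pad(x, pad)              # 'same' padding keeps the spatial size
    out = np.zeros_like(x, dtype=float)
    for r in range(x.shape[0]):
        for c in range(x.shape[1]):
            out[r, c] = np.sum(xp[r:r + kh, c:c + kw] * kernel)
    # Simplified batch normalization over this single feature map.
    out = gamma * (out - out.mean()) / np.sqrt(out.var() + eps) + beta
    out = np.maximum(0.0, out)       # ReLU
    return out + x                   # skip connection: add the input back

x = np.random.default_rng(0).random((7, 7))
k = np.ones((3, 3)) / 9.0            # placeholder averaging filter
y = conv_block_with_skip(x, k)
print(y.shape)  # (7, 7): same size as the input, as the skip addition requires
```

Because the ReLU output is non-negative, the block's output is never smaller than its input here; more importantly, the identity path lets gradients flow past the block during backpropagation, which is why skip connections ease the training of deep networks.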
Figure 6. An infographic as a guide to help readers find articles of interest in relation to a specific task, sample type, or imaging modality, with each number in the graphic matching the corresponding reference. An interactive version linking directly to the source articles can be found at https://anindgupta.github.io/cyto_review_paper.github.io. (138–338) [Color figure can be viewed at wileyonlinelibrary.com]