Jintao Li1, Jie Chen2,3, Hua Bai1, Haiwei Wang1, Shiping Hao1, Yang Ding1, Bo Peng1, Jing Zhang4, Lin Li1,5, Wei Huang1,5.
Abstract
Microfluidic-based organs-on-chips (OoCs) are a rapidly developing technology in biomedical and chemical research and have emerged as one of the most advanced and promising in vitro models. The miniaturization, simulated tissue mechanical forces, and controllable microenvironment of OoCs offer unique properties for biomedical applications. However, the large amount of data generated by the high parallelization of OoC systems has grown far beyond the scope of manual analysis by researchers with biomedical backgrounds. Deep learning, an emerging area of machine learning research, can automatically mine the inherent characteristics and patterns of "big data" and has achieved remarkable success in computer vision, speech recognition, and natural language processing. The integration of deep learning into OoCs is an emerging field that holds enormous potential for drug development, disease modeling, and personalized medicine. This review briefly describes the basic concepts and mechanisms of microfluidics and deep learning and summarizes their successful integration. We then analyze the combination of OoCs and deep learning for image digitization, data analysis, and automation. Finally, the problems faced in current applications are discussed, and future perspectives and suggestions are provided to further strengthen this integration.
Year: 2022 PMID: 35136860 PMCID: PMC8795883 DOI: 10.34133/2022/9869518
Source DB: PubMed Journal: Research (Wash D C) ISSN: 2639-5274
Figure 1. Integration of deep learning with organs-on-chips (OoCs). Deep learning has been applied to device design, real-time monitoring, and image processing in OoCs. In the future, it may be further applied to organelle tracking, mechanical force mimicking, drug screening, rare disease diagnosis, and human-on-chip regulation (created with http://BioRender.com).
Figure 2. The emergence of OoC technology provides a strong bridge between animal models and traditional in vitro models, balancing physiological relevance and complexity with throughput and reproducibility (created with http://BioRender.com).
Figure 3. Integration of microfluidic technology, biomaterials, and cell biology yields an advanced in vitro OoC system. Cells are extracted from a human body (2) and placed in perfusable microfluidic devices (3) to make OoCs (4). Multiple OoCs connected together form a human-on-a-chip system (5), which ultimately aims to faithfully replicate the key functions of the human body and therefore holds great potential for use in drug discovery and pathological research (created with http://BioRender.com).
Figure 4. Some successful applications of OoCs and their corresponding functions (created with http://BioRender.com).
Figure 5. Visual relationship among AI, machine learning, and deep learning.
Figure 6. In-depth analysis of the development of deep learning. (a) Biological neuron and the M-P model; to simplify the model and ease notation, complex biological factors are ignored. (b) Single-layer perceptron. (c) Backpropagation algorithm. (d) Convolutional neural networks.
Figure 7. Some typical deep learning networks. (a) Architecture of LeNet. Each square is a feature map, and the weights within each set of squares are constrained to be identical. (b) Architecture of a DBN. The DBN consists of several hidden layers and a visible layer, with connections between layers but not between the units within a layer. The hidden layers are trained to capture the correlations of the data presented to the visible layer. (c) Architecture of AlexNet. It is similar to LeNet but stacks additional convolution (Conv) layers, making it deeper than LeNet. In addition, it uses ReLU as the activation function and is trained on considerably more data than LeNet.
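To make the M-P/perceptron lineage in Figure 6 concrete, the sketch below implements a single-layer perceptron in plain Python: a weighted sum passed through a step activation, trained with the classic perceptron learning rule. The AND-gate task and the learning rate are illustrative choices, not taken from the review.

```python
# Minimal single-layer perceptron sketch (illustrative, not from the review):
# an M-P-style neuron (weighted sum + step activation) trained with the
# perceptron learning rule on the linearly separable AND function.

def step(z):
    """Heaviside step activation used by the M-P neuron."""
    return 1 if z >= 0 else 0

def predict(weights, bias, x):
    """Fire (1) when the weighted input sum crosses the threshold."""
    return step(sum(w * xi for w, xi in zip(weights, x)) + bias)

def train_perceptron(samples, lr=0.1, epochs=20):
    """Perceptron rule: nudge weights by lr * (target - prediction) * input."""
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in samples:
            error = target - predict(weights, bias, x)
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
            bias += lr * error
    return weights, bias

AND_GATE = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND_GATE)
print([predict(w, b, x) for x, _ in AND_GATE])  # [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron rule converges; multilayer networks trained by backpropagation (Figure 6(c)) remove this separability restriction.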
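The feature-map sizes in a LeNet-style stack (Figure 7(a)) follow from simple arithmetic. The sketch below traces the standard LeNet-5 layout (5x5 convolutions alternating with 2x2 pooling on a 32x32 input); the layer sizes are from the common description of LeNet-5, not from this review.

```python
# Feature-map size bookkeeping for a LeNet-style CNN (a sketch of the
# classic LeNet-5 layout; sizes follow the standard description).

def conv_out(size, kernel, stride=1, padding=0):
    """Output side length of a square convolution or pooling layer."""
    return (size + 2 * padding - kernel) // stride + 1

size = 32                           # LeNet-5 input: 32x32 image
size = conv_out(size, kernel=5)     # C1: 5x5 conv -> 28x28
size = conv_out(size, 2, stride=2)  # S2: 2x2 pool -> 14x14
size = conv_out(size, kernel=5)     # C3: 5x5 conv -> 10x10
size = conv_out(size, 2, stride=2)  # S4: 2x2 pool -> 5x5
print(size)  # 5
```

Deeper networks such as AlexNet apply the same bookkeeping over more layers, with padding keeping intermediate maps from shrinking too quickly.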
Summary of different applications of deep learning in microfluidics and in OoCs.

| Application | Device design | Subject | Input | Architecture | Output | Function | Refs |
|---|---|---|---|---|---|---|---|
| Deep learning in microfluidics | A microfluidic device with three capillary tubes | The microdroplet generated in a T-junction microfluidic system | Four parameters that affect the size of the microdroplet | An ANN architecture | Length of the droplet and diameter of the junction | Predict the size of the microdroplet at the exit of the T-junction under different parameters | [ |
| | Two pressure sensors and a single microchannel filled with a liquid metal | Microfluidic soft sensors | An analog voltage | An RNN with an attention module | Pressure estimation and localization | Estimate both pressure magnitude and location while accounting for hysteresis | [ |
| | A fluid flow shape model determined by micropillars | Flow sculpting | The top-half image of a microchannel shape | A CNN architecture | Corresponding pillar sequences | Make predictions and deliver comparable designs for flow sculpting | [ |
| | Two microfluidic devices with four culture channels | | Spectrum images converted from original images via FFT | An AlexNet architecture | Pixel count in the spectrum images | Recognize regional concentration changes of the cultured bacteria | [ |
| | A plastic slide with physical channels in medium | Bone marrow from mouse tibiae and ilia | Long-term, time-lapse microscopy cell patches | A CNN-RNN architecture | Cell lineage score | Predict the lineage choice of stem cells' progeny | [ |
| Deep learning in OoCs | A microfluidic device composed of a central immune chamber and two tumor chambers | IFN-DCs (interferon-conditioned dendritic cells) | Time-lapse images that record cell trajectories in 3D tumor spaces | An unsupervised image-analysis algorithm, Cell Hunter | Parameters that characterize IFN-DC behavior toward cancer cells | Track immune cell-tumor interactions in real time | [ |
| | A microfluidic device composed of six reservoirs and four chambers | Three groups of human PBMCs | Data collected by a microfluidic platform and time-lapse video | Cell Hunter | Trajectories of specific cells | Track the migration of human PBMCs toward tumor cells and their interactions | [ |
| | A microfluidic device with 3D biomimetic hydrogels inside microchambers | HER2+ breast cancer BT474 cell line and PBMCs | An atlas of videos at varying spatiotemporal resolutions | Cell Hunter | A set of kinematic and interaction descriptors | Describe motility and interactions at varying spatiotemporal resolutions | [ |
| | A 3D coculture microfluidic device with a central vascular compartment and two lateral chambers | HER2+ breast cancer BT474 cell line, the breast CAF cell line Hs578T, and PBMCs | Time-lapse videos and images reconstructed in 3D | Cell Hunter | Parameters that record the interaction of a single cancer cell with all PBMCs | Characterize drug responses and dissect the roles of immune cells and fibroblasts | [ |
| | A microfluidic device with 3D biomimetic gels | HER2+ breast cancer BT474 cell line and PBMCs | Video sequences of cells | Cell Hunter and a CNN architecture | Atlas of experimental cell tracks and types | Discover hidden messages within cell trajectories for cancer drug treatments | [ |
| | A stretchable micropatterned 3D human skeletal muscle platform | Human skeletal muscle cells and myogenic stem cells | Morphological images of skeletal muscle cells | A CNN-RNN architecture | Temporal prediction and cell function of muscle cells | Assess the physiological status, contractile type, and performance of muscle cells more easily | [ |
Figure 8. Deep learning in device parameters. (a) Formation mechanism of microdroplets in a microfluidic T-junction. The dispersed phase is perpendicular to the lateral channel, and two syringe pumps supply and control the two fluids. The neural network below contains 10 neurons in its hidden layer (reproduced with permission from Ref. [71]). (b) Hierarchical signal-level recurrent network, which can concurrently learn to forecast pressure and location (reproduced with permission from Ref. [72]). (c) Workflow of the trained deep learning network (reproduced with permission from Ref. [73]).
Figure 9. Deep learning in images. (a) CNN architecture that estimates bacterial growth in microfluidic channels (reproduced with permission from Ref. [91]). (b) CNN architecture combined with an RNN architecture, which uses the temporal information of a single-cell track to automatically predict the lineage choice of a stem cell's progeny (reproduced with permission from Ref. [92]).
Figure 10. Deep learning in OoCs. (a) 3D schematic of a device for real-time monitoring of cell interactions (reproduced with permission from Ref. [94]). (b) General scheme of the tumor-on-a-chip, consisting of six reservoirs for culture-medium replacement and four chambers for cell culture (reproduced with permission from Ref. [96]). (c) Definition of the stochastic particle interaction model. Physical interactions among cells were modeled through repulsion-attraction exchanges; for the immune cell-cancer cell interaction, an attraction was imposed in the proximity of the target cell (reproduced with permission from Ref. [97]).
Figure 11. Deep learning in OoCs. (a) Tumor-on-a-chip approach: a central vascular compartment made of a monolayer of endothelial cells (pink) and lateral chambers with 3D collagen hydrogels (gray) in which cancer cells (green), immune cells (blue), and CAFs (red) were embedded (reproduced with permission from Ref. [98]). (b) Schematic representation of the proposed method (reproduced with permission from Ref. [99]). (c) Differential expansion microscopy (DExM) images were fed into an RNN architecture with LSTM memory blocks to predict muscle-cell morphology and generate temporal DExM images; these images were then used by a CNN architecture to predict cell function (reproduced with permission from Ref. [100]).
Figure 12. Key applications of deep learning in OoCs: prediction, target recognition, image segmentation, and tracking.