| Literature DB >> 30863306 |
Arianna Mencattini, Fabrizio Mattei, Giovanna Schiavoni, Annamaria Gerardino, Luca Businaro, Corrado Di Natale, Eugenio Martinelli.
Abstract
The increasing interest in microfluidic devices in medicine and biology has opened the way to a new era of time-lapse microscopy, in which the volume of images and their acquisition time will become crucial. In this light, new data analysis algorithms have to be developed in order to extract novel features of cell behavior and cell-cell interactions. In this brief article, we emphasize the potential strength of a new paradigm arising from the integration of microfluidic devices (i.e., organ on chip), time-lapse microscopy analysis, and machine learning approaches. Some snapshots of previous case studies in the context of immunotherapy are included as proofs of concept of the proposed strategies, while a visionary description concludes the work, foreseeing future research and applicative scenarios.
Keywords: cell interaction analysis; image analysis; machine learning; organ on chip; time-lapse microscopy
Year: 2019 PMID: 30863306 PMCID: PMC6399655 DOI: 10.3389/fphar.2019.00100
Source DB: PubMed Journal: Front Pharmacol ISSN: 1663-9812 Impact factor: 5.810
FIGURE 1. Scheme of a high-throughput platform for the advanced study and reproduction of the tumor microenvironment. The microfluidic device is manufactured ad hoc, according to the requirements of the biological experiment. Then, the desired cell subsets are loaded into the chip together with tumor cells, so as to reproduce a simplified version of the tumor microenvironment. Time-lapse microscopy is used to acquire the high-resolution frames of the whole video sequence. Microscopy settings are chosen according to the scale of the objects of interest and the duration of the time-lapse. Cells are then automatically localized and tracked across each frame of the video sequence, and trajectories are characterized in terms of individual and aggregated kinematic and morphological descriptors. At this point, specifically developed machine learning algorithms are applied to recognize patterns for biological reasoning. For example, cell tracking datasets can be clustered into separate groups reflecting distinct cell behaviors. The same kinematic and morphological descriptors can be used as input for in silico models aimed at simulating on-chip experiments.
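The tracking-to-clustering step described in the figure can be illustrated with a minimal sketch (not taken from the paper; all function names and the synthetic data are hypothetical): per-trajectory kinematic descriptors such as mean speed and straightness are computed from tracked positions, and a simple k-means step splits trajectories into behavior groups (e.g., directed versus confined motion).

```python
import math
import random

def kinematic_descriptors(track):
    """Descriptors for one trajectory, given as a list of (x, y) positions
    sampled at a fixed frame interval (assumed here to be one time unit)."""
    steps = [math.dist(track[i], track[i + 1]) for i in range(len(track) - 1)]
    path_length = sum(steps)
    net_displacement = math.dist(track[0], track[-1])
    mean_speed = path_length / len(steps)
    # Straightness (confinement ratio): near 1 for directed motion,
    # near 0 for confined/random motion.
    straightness = net_displacement / path_length if path_length else 0.0
    return mean_speed, straightness

def kmeans_1d(values, k=2, iters=50):
    """Minimal 1-D k-means used to split trajectories into behavior groups."""
    centers = [min(values), max(values)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            groups[min(range(k), key=lambda j: abs(v - centers[j]))].append(v)
        centers = [sum(g) / len(g) if g else centers[j]
                   for j, g in enumerate(groups)]
    labels = [min(range(k), key=lambda j: abs(v - centers[j])) for v in values]
    return labels, centers

# Synthetic stand-ins for tracked cells: 5 directed and 5 confined trajectories.
random.seed(0)
directed = [[(i * 2.0 + random.random(), random.random()) for i in range(20)]
            for _ in range(5)]
confined = [[(random.random(), random.random()) for _ in range(20)]
            for _ in range(5)]

mean_speeds = [kinematic_descriptors(t)[0] for t in directed + confined]
labels, centers = kmeans_1d(mean_speeds)
```

In a real pipeline the positions would come from the automatic cell localization and tracking stage, and the clustering would typically operate on the full descriptor vector (kinematic plus morphological features) rather than on mean speed alone.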