
Automated cell tracking using StarDist and TrackMate.

Elnaz Fazeli1, Nathan H Roy2, Gautier Follain3,4, Romain F Laine5,6, Lucas von Chamier5, Pekka E Hänninen1, John E Eriksson3,4, Jean-Yves Tinevez7, Guillaume Jacquemet3,4.   

Abstract

The ability of cells to migrate is a fundamental physiological process involved in embryonic development, tissue homeostasis, immune surveillance, and wound healing. Therefore, the mechanisms governing cellular locomotion have been under intense scrutiny over the last 50 years. One of the main tools of this scrutiny is live-cell quantitative imaging, where researchers image cells over time to study their migration and quantitatively analyze their dynamics by tracking them using the recorded images. Despite the availability of computational tools, manual tracking remains widely used among researchers due to the difficulty of setting up robust automated cell tracking and large-scale analysis. Here we provide a detailed analysis pipeline illustrating how the deep learning network StarDist can be combined with the popular tracking software TrackMate to perform 2D automated cell tracking and provide fully quantitative readouts. Our proposed protocol is compatible with both fluorescent and widefield images. It only requires freely available and open-source software (ZeroCostDL4Mic and Fiji), and does not require any coding knowledge from the users, making it a versatile and powerful tool for the field. We demonstrate this pipeline's usability by automatically tracking cancer cells and T cells using fluorescent and brightfield images. Importantly, we provide, as supplementary information, a detailed step-by-step protocol to allow researchers to implement it with their images.

Copyright: © 2020 Fazeli E et al.


Keywords:  Automated tracking; Cell migration; Deep-learning; Image analysis; StarDist; TrackMate

Year:  2020        PMID: 33224481      PMCID: PMC7670479          DOI: 10.12688/f1000research.27019.1

Source DB:  PubMed          Journal:  F1000Res        ISSN: 2046-1402


Introduction

The study of cell motility typically involves recording cell behavior using live-cell imaging and tracking cell movement over time [1, 2]. To enable the analysis of such data, various software solutions have been developed [3–9]. However, despite the availability of these computational tools, manual tracking remains widely used among researchers due to the difficulty of setting up fully automated cell tracking analysis pipelines. Automated tracking pipelines share a typical workflow that starts with a segmentation strategy to identify the objects to track in each image. Tracking algorithms are then used to link these objects between frames. A challenging aspect of an automated tracking pipeline is often achieving an accurate segmentation of the objects to track. One option to facilitate cell segmentation is to label their nuclei using fluorescent dyes or protein markers. Nuclei can then be automatically segmented using intensity-based thresholding. However, this approach tends to become inaccurate when images are noisy or when the cells to track are very crowded [10]. Deep learning approaches have demonstrated their robustness against both issues [11]. In this work, we present a new analysis workflow that builds upon a deep learning segmentation tool and a cell tracking tool to achieve robust cell tracking in cell migration assays. We combine StarDist, a powerful deep learning-based segmentation tool, and TrackMate, a user-friendly tracking tool, into a tracking pipeline that can be used without requiring computing expertise or specialized hardware (Figure 1) [12–15].
Figure 1.

Workflow depicting how StarDist and TrackMate can be combined to track cells automatically.
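The segment-then-link workflow described above can be illustrated with a deliberately simple greedy nearest-neighbour linker. This is only a sketch of the principle (the function name and the distance gate are invented for illustration); TrackMate's LAP-based trackers additionally handle gap closing, track splitting, and merging, which this toy linker ignores.

```python
import math

def link_frames(detections, max_dist=30.0):
    """Greedy nearest-neighbour linking of per-frame detections into tracks.

    `detections` is a list with one entry per frame, each a list of (x, y)
    centroids produced by a segmentation step.  `link_frames` and `max_dist`
    are invented for this sketch, not part of TrackMate's API.
    """
    if not detections:
        return []
    tracks = [[pt] for pt in detections[0]]  # one track per first-frame object
    for frame in detections[1:]:
        unclaimed = list(frame)
        for track in tracks:
            if not unclaimed:
                break
            last = track[-1]
            # pick the closest still-unclaimed detection in this frame
            best = min(unclaimed, key=lambda p: math.dist(last, p))
            if math.dist(last, best) <= max_dist:
                track.append(best)
                unclaimed.remove(best)
        # detections that matched no existing track start new tracks
        tracks.extend([pt] for pt in unclaimed)
    return tracks
```

For two well-separated cells moving a little between two frames, this linker recovers two tracks of two points each; crowded scenes are exactly where such a greedy scheme fails and TrackMate's cost-matrix approach pays off.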

Methods

Pipeline

The use of deep learning networks, such as StarDist, often requires the user to train or retrain a model using their own images. While high-quality pre-trained StarDist models are readily available, they are likely to underperform when used on different data with, e.g., different staining, noise levels, or microscope types [15]. To train StarDist models, we took advantage of the ZeroCostDL4Mic platform, which allows researchers to train (and retrain), validate, and use deep learning networks [15]. Importantly, the ZeroCostDL4Mic StarDist 2D notebook can directly output files containing the geometric center coordinates of all nuclei (named tracking files), which can be used as input for TrackMate (Figure 1). Therefore, our proposed pipeline can be divided into three parts (Figure 1; Extended data [16]). 1) First, a StarDist model is trained using the ZeroCostDL4Mic platform. This part needs to be performed only once for each type of data. 2) Second, the trained StarDist model is used to segment the objects to track and to generate tracking files. 3) Finally, the tracking files are used in TrackMate to track the identified objects.

Training a StarDist model requires a set of images and their corresponding masks (Figure 1 and Figure 2). Generating a training dataset is by far the most time-consuming part of the analysis pipeline presented here, as it requires manual annotation of the images to analyze (Extended data: Supplementary protocol [16]). For instance, to generate the training datasets presented in Figure 2, each cell/nucleus contour was drawn manually using the freehand selection tool in Fiji. The creation of a high-quality training dataset is a critical part of the process, as it will impact the specificity and performance of the StarDist model. However, the generation of a training dataset is only required once per dataset type. If a StarDist model already exists for similar images, it can be used to significantly accelerate the creation of the training dataset via semi-automated annotation (see Extended data: Supplementary protocol [16]).
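A tracking file of the kind described above simply lists, per frame, the geometric center of each segmented object. The following pure-Python sketch computes centroids from an instance label mask (0 = background, as StarDist outputs) and writes them to a CSV; the (frame, label, x, y) column layout is an assumption for illustration, not ZeroCostDL4Mic's exact file format.

```python
import csv
from collections import defaultdict

def label_centroids(label_image, frame):
    """Geometric centre of every labelled object in one frame.

    `label_image` is a 2D array of integer labels (0 = background), i.e. the
    kind of instance mask StarDist produces.
    """
    sums = defaultdict(lambda: [0.0, 0.0, 0])  # label -> [sum_x, sum_y, count]
    for y, row in enumerate(label_image):
        for x, lab in enumerate(row):
            if lab:
                acc = sums[lab]
                acc[0] += x
                acc[1] += y
                acc[2] += 1
    return [(frame, lab, sx / n, sy / n) for lab, (sx, sy, n) in sorted(sums.items())]

def write_tracking_file(path, rows):
    """Write centroid rows to a CSV that a tracking step could ingest."""
    with open(path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["frame", "label", "x", "y"])
        writer.writerows(rows)
```

Running `label_centroids` on each frame of a segmented movie and concatenating the rows yields the per-frame detection list that the linking step consumes.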
Figure 2.

Example of datasets analyzed using StarDist and TrackMate.

(A, B) Migration of MCF10DCIS.com cells, labeled with SiR-DNA, recorded using a spinning disk confocal microscope and automatically tracked. Examples of images used to train StarDist (A) and an example of the results obtained using automated tracking (B, Video 1) are displayed. The yellow square indicates a magnified ROI, where the local track of each nucleus is displayed. The full cell tracks are displayed on the left. Tracks are color-coded as a function of their maximum instantaneous velocity (blue, slow; red, fast). (C–E) Migration of activated T cells plated on VCAM-1 or ICAM-1, recorded using a brightfield microscope and automatically tracked. Examples of images used to train StarDist (C) and an example of the results obtained using automated tracking (D, Video 2) are displayed. (E) Comparison of the migration of activated T cells on VCAM-1 or ICAM-1. Track mean speed and track straightness were quantified. Data are displayed as boxplots. *** p < 0.001; p-values were determined using a randomization test. (F, G) Cancer cells flowing in a microfluidic chamber, recorded live using a brightfield microscope and automatically tracked (Video 3). Examples of images used to train StarDist (F) and an example of the results obtained using automated tracking (G) are displayed. The full tracks shown here were color-coded as a function of their x coordinate.

One of our analysis pipeline's key features is that, once a StarDist model has been satisfactorily trained, movies of migrating cells can efficiently be processed in batch. Indeed, while individual tracking files can be analyzed one by one using the TrackMate graphical interface, we also provide a Fiji macro to analyze a folder containing multiple tracking files. Our batch processing macro will provide basic quantitative information for each track, including median and maximal speeds.
If more information is needed, the tracking results generated by our script are directly compatible with the Motility lab website, where they can be further processed [17].
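The per-track readouts mentioned above (median and maximal speeds, plus the track straightness quantified in Figure 2E) can be computed directly from a track's coordinates. This is an illustrative sketch, not the batch macro itself, and the function and key names are assumptions:

```python
import math
from statistics import median

def track_stats(track, dt=1.0):
    """Per-track readouts: median/max instantaneous speed and straightness.

    `track` is a time-ordered list of (x, y) positions with at least two
    points; `dt` is the frame interval.  Straightness is net displacement
    divided by total path length (1.0 = a perfectly straight track).
    """
    steps = [math.dist(a, b) for a, b in zip(track, track[1:])]
    speeds = [s / dt for s in steps]
    path_length = sum(steps)
    return {
        "median_speed": median(speeds),
        "max_speed": max(speeds),
        "straightness": math.dist(track[0], track[-1]) / path_length if path_length else 0.0,
    }
```

Applying this to every track in a folder of tracking files reproduces the kind of table the batch macro exports for downstream plotting.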

Implementation and operation

The described image analysis pipeline is composed of a Jupyter notebook optimized to run in Google Colab (ZeroCostDL4Mic framework [15]) and a Python script that can run in Fiji [14]. A step-by-step protocol describing how to use our analysis pipeline is provided as Extended data [16].

Use case

To illustrate our analysis pipeline's functionality and flexibility, we first trained a StarDist model to analyze the behavior of breast cancer cells migrating collectively (Figure 2A; Extended data: Video 1 [16]). The cancer cells' nuclei were fluorescently labeled, and the cells were imaged using fluorescence-based microscopy. The creation of the training dataset used in this example was greatly facilitated by the availability of a StarDist model, released by the StarDist creators, capable of segmenting fluorescent nuclei. In this case, the StarDist Fiji plugin was used to segment the nuclei in the training images, and all mis-annotations were manually corrected (Extended data: Supplementary protocol [16]).

Video 1: Automated tracking of breast cancer cells migrating collectively

Video 1: Automated tracking of breast cancer cells migrating collectively. MCF10DCIS.com cells, labeled with SiR-DNA, were recorded using a spinning disk confocal microscope and automatically tracked using StarDist and TrackMate. Local tracks are displayed.

To highlight that our pipeline can also be used to analyze brightfield images, we generated a StarDist model to track T cells migrating on ICAM-1 or VCAM-1 (Figure 2C–E; Extended data: Video 2 [16]). Importantly, automated analysis of these data could reproduce the results obtained via manual tracking [19].

Video 2: Automated tracking of T cells migrating on ICAM-1

Video 2: Automated tracking of T cells migrating on ICAM-1. Activated T cells plated on ICAM-1 were recorded using a brightfield microscope and automatically tracked using StarDist and TrackMate. Local tracks are displayed.

Finally, we used our pipeline to automatically track non-adherent cancer cells flowing in a microfluidic chamber (Figure 2F and G; Extended data: Video 3 [16]). In this case, automated tracking is especially useful due to the very high number of frames to analyze. For the last two examples, no suitable pre-trained StarDist models were available. Therefore, to generate the training datasets, we manually annotated 20 images and trained a first StarDist model. This model was then used to accelerate the annotation of the remaining training images.

Video 3: Automated tracking of cancer cells flowing in a microfluidic chamber

Video 3: Automated tracking of cancer cells flowing in a microfluidic chamber. AsPC1 pancreatic cancer cells flowing in a microfluidic chamber were recorded live using a brightfield microscope and automatically tracked using StarDist and TrackMate. Local tracks are displayed.

Use case dataset creation

MCF10DCIS.com cells were described previously [15, 22]. DCIS.COM lifeact-RFP cells were incubated for 2 h with 0.5 µM SiR-DNA (SiR-Hoechst, Tetu-bio, Cat Number: SC007) before being imaged live for 14 h using a spinning-disk confocal microscope (one image every 10 min). The spinning-disk confocal microscope used was a Marianas spinning disk imaging system with a Yokogawa CSU-W1 scanning unit on an inverted Zeiss Axio Observer Z1 microscope (Intelligent Imaging Innovations, Inc.) equipped with a 20x (NA 0.8) air Plan Apochromat objective (Zeiss).

Lab-Tek 8-chamber slides (ThermoFisher) were coated with 2 μg/mL ICAM-1 or VCAM-1 overnight at 4°C [19]. Activated primary mouse CD4+ T cells were washed and resuspended in L-15 medium containing 2 mg/mL D-glucose. T cells were then added to the chambers, incubated for 20 min, gently washed to remove all unbound cells, and imaged. Imaging was performed using a 10x phase-contrast objective at 37°C on a Zeiss Axiovert 200M microscope equipped with an automated X-Y stage and a Roper EMCCD camera. Time-lapse images were collected every 30 s for 10 min using SlideBook 6 software (Intelligent Imaging Innovations).

Cancer cells (500,000 cells/mL in PBS) were perfused at a speed of 300 µm/s using a peristaltic pump (ISMATEC MS12/4 analogic) and a homemade tubing system (Ismatek 3-Stop tubes and Ibidi® tubing and connectors) in a microchannel (Ibidi® µ-slides400 LUER). Images were acquired with a brightfield microscope (Zeiss Laser-TIRF 3 Imaging System, Carl Zeiss) and a 10x objective.

Data display and statistical analyses

Box plots were generated using PlotsOfData [23]. Randomization tests were performed using the online tool PlotsOfDifferences [24].
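A randomization test of the kind used here compares an observed difference between two groups against the differences obtained after repeatedly shuffling the group labels. The following is a generic sketch of that idea; PlotsOfDifferences' exact procedure and test statistic may differ.

```python
import random
from statistics import mean

def randomization_test(a, b, n_iter=10_000, seed=0):
    """Two-sided randomization test on the difference of group means.

    Repeatedly shuffles the pooled observations and counts how often a
    random split produces a difference at least as large as the observed
    one; the returned fraction is the permutation p-value.
    """
    rng = random.Random(seed)
    observed = abs(mean(a) - mean(b))
    pooled, n_a = list(a) + list(b), len(a)
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        if abs(mean(pooled[:n_a]) - mean(pooled[n_a:])) >= observed:
            hits += 1
    return hits / n_iter
```

With identical groups every shuffle matches the (zero) observed difference, so the p-value is 1; with clearly separated groups, only the rare shuffles that reconstitute the original split do, giving a small p-value.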

Conclusions

Here we show that StarDist and TrackMate can be integrated seamlessly and robustly to automate cell tracking in fluorescence and brightfield images. We envision that this pipeline can also be applied to any round or oval-shaped objects. However, we acknowledge that brightfield images may not always work directly with our pipeline, especially if cells display complex and changing shapes, since StarDist is mostly designed to detect round or compact shapes. In such cases, other tools, such as Usiigaci, could be considered [8]. Alternatively, brightfield images could be artificially labeled using deep learning, transforming the brightfield dataset into a pseudo-fluorescence one, as can already be done with ZeroCostDL4Mic [15]. The pipeline described here is currently limited to the tracking of objects in 2D. However, a similar workflow can be applied to 3D datasets, as both StarDist and TrackMate can accommodate 3D images [12, 13, 25].

Data availability

Underlying data

Zenodo: Combining StarDist and TrackMate example 1 - Breast cancer cell dataset, http://doi.org/10.5281/zenodo.4034976 [26]
Zenodo: Combining StarDist and TrackMate example 2 - T cell dataset, http://doi.org/10.5281/zenodo.4034929 [27]
Zenodo: Combining StarDist and TrackMate example 3 - Flow chamber dataset, http://doi.org/10.5281/zenodo.4034939 [28]

Extended data

Zenodo: Combining StarDist and TrackMate - Extended data, http://doi.org/10.5281/zenodo.4091467 [16]. This project contains the following extended data:

Supplementary protocol
Video 1: Automated tracking of breast cancer cells migrating collectively. MCF10DCIS.com cells, labeled with SiR-DNA, were recorded using a spinning disk confocal microscope and automatically tracked using StarDist and TrackMate. Local tracks are displayed.
Video 2: Automated tracking of T cells migrating on ICAM-1. Activated T cells plated on ICAM-1 were recorded using a brightfield microscope and automatically tracked using StarDist and TrackMate. Local tracks are displayed.
Video 3: Automated tracking of cancer cells flowing in a microfluidic chamber. AsPC1 pancreatic cancer cells flowing in a microfluidic chamber were recorded live using a brightfield microscope and automatically tracked using StarDist and TrackMate. Local tracks are displayed.

Data are available under the terms of the Creative Commons Attribution 4.0 International license (CC-BY 4.0).

Software availability

Source code available from: https://github.com/HenriquesLab/ZeroCostDL4Mic
Archived source code at time of publication: http://doi.org/10.5281/zenodo.4091474 [26]
License: MIT license.


Peer review reports

Reviewer report 1

In this article, Fazeli et al. combine deep-learning segmentation tools with open-access cell tracking platforms to show the feasibility of automated cell tracking and large-population motility analysis. Using this approach, the authors clearly describe and validate this process in fluorescent movies of collective cell migration as well as single-cell brightfield cell migration datasets. While the description of the methods is very clear (special kudos to the authors for transparency and availability) and its applicability very apparent, a few other potential points to consider are as follows:

1. A comparison, on a given dataset, between TrackMate with StarDist-based segmentation and canonical nuclei segmentation methods. How much better does it perform? This could be in terms of accuracy/precision or time taken to analyze.
2. Related to point 1: the power of StarDist-based nuclear segmentation is its performance in crowded environments and where there is poor signal in the fluorescent channels. Is there a way to explicitly show that, when there is poor signal or bleaching during a movie, StarDist combined with TrackMate does a better job?
3. Brightfield objects are among the hardest to segment, which is why the field has been relying on hand tracking of cell migration movies taken with brightfield. The authors should underscore this and highlight that this method overcomes this big challenge, enabling larger population analysis.

Are the conclusions about the tool and its performance adequately supported by the findings presented in the article? Yes
Is the rationale for developing the new software tool clearly explained? Yes
Is the description of the software tool technically sound? Yes
Are sufficient details of the code, methods and analysis (if applicable) provided to allow replication of the software development and its use by others? Yes
Is sufficient information provided to allow interpretation of the expected output datasets and any results generated using the tool? Yes

Reviewer Expertise: Cell migration, mechanobiology, cancer.

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard.

Reviewer report 2

The authors propose the combination of open-source tools (namely StarDist and TrackMate) for the automatic tracking of cells in fluorescence and brightfield images in 2D. Moreover, they provide a step-by-step workflow to process videos in batch mode using exclusively free tools. They evaluate its performance by comparing the results obtained using this workflow with those of manual tracking from an already published public dataset. The paper is very well written, concise and straight to the point, with a special emphasis on reproducibility. As pointed out in the conclusions, one of the limitations of the proposed approach is inherent to the type of objects that StarDist can properly segment (basically round). However, that does not prevent this pipeline from being extremely useful for a broad spectrum of cell tracking problems. Moreover, the adaptation of the pipeline to 3D images seems pretty straightforward. Something interesting that is not mentioned in the paper is how well the workflow would perform in the presence of cell divisions or apoptosis. That could be easily tested using some of the datasets from the Cell Tracking Challenge (http://celltrackingchallenge.net/2d-datasets/).

Minor comments: Please homogenize how you write "deep learning", which appears sometimes as "Deep-Learning" and sometimes as "deep learning".

Are the conclusions about the tool and its performance adequately supported by the findings presented in the article? Yes
Is the rationale for developing the new software tool clearly explained? Yes
Is the description of the software tool technically sound? Yes
Are sufficient details of the code, methods and analysis (if applicable) provided to allow replication of the software development and its use by others? Yes
Is sufficient information provided to allow interpretation of the expected output datasets and any results generated using the tool? Yes

Reviewer Expertise: Computer vision, bioimage analysis.

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard, however I have significant reservations, as outlined above.

Reviewer report 3

This article presents a pipeline for analyzing cell migration in a variety of contexts by combining several complementary techniques. Utilising StarDist for cell detection (provided cells are round or have nuclear staining) and TrackMate for connecting the detected nuclei over time, a start-to-finish protocol is described, allowing a microscopist with little image analysis knowledge to quantify their experiment. The authors do an admirable job of describing the required steps of the analysis pipeline, including an introduction to Jupyter Notebooks and the ZeroCostDL4Mic workflows to train a custom StarDist model. They also provide a Fiji macro for batch analysis, potentially saving a researcher many hours of human analysis time. Indeed, as all of these methods are published and validated already, my only (small) criticism has to do with the training of the StarDist models. As the authors rightly note, this is the most time-consuming part of the analysis, and as a result I would have liked to see some mention of how much manual time is required. For instance, while the article suggests training on a small 20-image dataset and then using transfer learning to speed up the remaining annotation, the number of remaining images is not described. While the amount of training required is likely to vary across experiments, I think readers would benefit from knowing, when considering this pipeline, how many cells (rather than image fields of view) they are likely to be required to manually annotate. Perhaps, for each dataset where a model was trained, the authors could specify the size of the training dataset (in both images and number of cells). Overall, this is a clear description of several powerful tools being combined into a very useful and versatile workflow.

Are the conclusions about the tool and its performance adequately supported by the findings presented in the article? Yes
Is the rationale for developing the new software tool clearly explained? Yes
Is the description of the software tool technically sound? Yes
Are sufficient details of the code, methods and analysis (if applicable) provided to allow replication of the software development and its use by others? Yes
Is sufficient information provided to allow interpretation of the expected output datasets and any results generated using the tool? Yes

Reviewer Expertise: Bioimage analysis.

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard.
References (15 in total)

1.  Fiji: an open-source platform for biological-image analysis.

Authors:  Johannes Schindelin; Ignacio Arganda-Carreras; Erwin Frise; Verena Kaynig; Mark Longair; Tobias Pietzsch; Stephan Preibisch; Curtis Rueden; Stephan Saalfeld; Benjamin Schmid; Jean-Yves Tinevez; Daniel James White; Volker Hartenstein; Kevin Eliceiri; Pavel Tomancak; Albert Cardona
Journal:  Nat Methods       Date:  2012-06-28       Impact factor: 28.547

2.  TrackMate: An open and extensible platform for single-particle tracking.

Authors:  Jean-Yves Tinevez; Nick Perry; Johannes Schindelin; Genevieve M Hoopes; Gregory D Reynolds; Emmanuel Laplantine; Sebastian Y Bednarek; Spencer L Shorte; Kevin W Eliceiri
Journal:  Methods       Date:  2016-10-03       Impact factor: 3.608

3.  Automated Tracking of Cell Migration with Rapid Data Analysis.

Authors:  Brian J DuChez
Journal:  Curr Protoc Cell Biol       Date:  2017-09-01

4.  Rac1 is deactivated at integrin activation sites through an IQGAP1-filamin-A-RacGAP1 pathway.

Authors:  Guillaume Jacquemet; Mark R Morgan; Adam Byron; Jonathan D Humphries; Colin K Choi; Christopher S Chen; Patrick T Caswell; Martin J Humphries
Journal:  J Cell Sci       Date:  2013-07-10       Impact factor: 5.285

5.  Automated cell tracking and analysis in phase-contrast videos (iTrack4U): development of Java software based on combined mean-shift processes.

Authors:  Fabrice P Cordelières; Valérie Petit; Mayuko Kumasaka; Olivier Debeir; Véronique Letort; Stuart J Gallagher; Lionel Larue
Journal:  PLoS One       Date:  2013-11-27       Impact factor: 3.240

6.  Objective comparison of particle tracking methods.

Authors:  Nicolas Chenouard; Ihor Smal; Fabrice de Chaumont; Martin Maška; Ivo F Sbalzarini; Yuanhao Gong; Janick Cardinale; Craig Carthel; Stefano Coraluppi; Mark Winter; Andrew R Cohen; William J Godinez; Karl Rohr; Yannis Kalaidzidis; Liang Liang; James Duncan; Hongying Shen; Yingke Xu; Klas E G Magnusson; Joakim Jaldén; Helen M Blau; Perrine Paul-Gilloteaux; Philippe Roudot; Charles Kervrann; François Waharte; Jean-Yves Tinevez; Spencer L Shorte; Joost Willemse; Katherine Celler; Gilles P van Wezel; Han-Wei Dan; Yuh-Show Tsai; Carlos Ortiz de Solórzano; Jean-Christophe Olivo-Marin; Erik Meijering
Journal:  Nat Methods       Date:  2014-01-19       Impact factor: 28.547

7.  In vitro Cell Migration, Invasion, and Adhesion Assays: From Cell Imaging to Data Analysis.

Authors:  Jordi Pijuan; Carla Barceló; David F Moreno; Oscar Maiques; Pol Sisó; Rosa M Marti; Anna Macià; Anaïs Panosa
Journal:  Front Cell Dev Biol       Date:  2019-06-14

8.  Evaluation of Deep Learning Strategies for Nucleus Segmentation in Fluorescence Images.

Authors:  Juan C Caicedo; Jonathan Roth; Allen Goodman; Tim Becker; Kyle W Karhohs; Matthieu Broisin; Csaba Molnar; Claire McQuin; Shantanu Singh; Fabian J Theis; Anne E Carpenter
Journal:  Cytometry A       Date:  2019-07-16       Impact factor: 4.355

9.  L-type calcium channels regulate filopodia stability and cancer cell invasion downstream of integrin signalling.

Authors:  Guillaume Jacquemet; Habib Baghirov; Maria Georgiadou; Harri Sihto; Emilia Peuhu; Pierre Cettour-Janet; Tao He; Merja Perälä; Pauliina Kronqvist; Heikki Joensuu; Johanna Ivaska
Journal:  Nat Commun       Date:  2016-12-02       Impact factor: 14.919

10.  CellProfiler 3.0: Next-generation image processing for biology.

Authors:  Claire McQuin; Allen Goodman; Vasiliy Chernyshev; Lee Kamentsky; Beth A Cimini; Kyle W Karhohs; Minh Doan; Liya Ding; Susanne M Rafelski; Derek Thirstrup; Winfried Wiegraebe; Shantanu Singh; Tim Becker; Juan C Caicedo; Anne E Carpenter
Journal:  PLoS Biol       Date:  2018-07-03       Impact factor: 8.029
