| Literature DB >> 31142829 |
Yair Rivenson, Hongda Wang, Zhensong Wei, Kevin de Haan, Yibo Zhang, Yichen Wu, Harun Günaydın, Jonathan E Zuckerman, Thomas Chong, Anthony E Sisk, Lindsey M Westbrook, W Dean Wallace, Aydogan Ozcan.
Abstract
The histological analysis of tissue samples, widely used for disease diagnosis, involves lengthy and laborious tissue preparation. Here, we show that a convolutional neural network trained using a generative adversarial-network model can transform wide-field autofluorescence images of unlabelled tissue sections into images that are equivalent to the bright-field images of histologically stained versions of the same samples. A blind comparison, by board-certified pathologists, of this virtual staining method and standard histological staining using microscopic images of human tissue sections of the salivary gland, thyroid, kidney, liver and lung, and involving different types of stain, showed no major discordances. The virtual-staining method bypasses the typically labour-intensive and costly histological staining procedures, and could be used as a blueprint for the virtual staining of tissue images acquired with other label-free imaging modalities.
Entities:
Mesh:
Substances:
Year: 2019 PMID: 31142829 DOI: 10.1038/s41551-019-0362-y
Source DB: PubMed Journal: Nat Biomed Eng ISSN: 2157-846X Impact factor: 25.671
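The abstract describes training a convolutional neural network with a generative adversarial-network (GAN) objective to map autofluorescence images to stained-appearance images. A typical conditional-GAN generator loss for such image-to-image translation combines an adversarial term with a pixel-wise fidelity term (as in pix2pix-style models). The sketch below illustrates that combined objective numerically with NumPy; the λ weighting, toy 8×8 patches, and function names are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def sigmoid(z):
    # logistic function, maps discriminator logits to probabilities
    return 1.0 / (1.0 + np.exp(-z))

def bce(p, target):
    # binary cross-entropy between predicted probabilities and labels
    eps = 1e-12
    return -np.mean(target * np.log(p + eps) + (1 - target) * np.log(1 - p + eps))

def generator_loss(d_logits_on_fake, fake_img, real_img, lam=100.0):
    """Illustrative pix2pix-style generator objective:
    adversarial term (fool the discriminator into predicting 'real')
    plus lam-weighted L1 fidelity to the histologically stained target."""
    adv = bce(sigmoid(d_logits_on_fake), np.ones_like(d_logits_on_fake))
    l1 = np.mean(np.abs(fake_img - real_img))
    return adv + lam * l1

# toy 8x8 "virtually stained" and "chemically stained" patches
rng = np.random.default_rng(0)
fake = rng.random((8, 8))
real = rng.random((8, 8))
d_logits = rng.standard_normal((8, 8))

loss = generator_loss(d_logits, fake, real)
print(float(loss) > 0.0)  # the combined loss is positive for mismatched patches
```

The heavy weighting on the L1 term (λ = 100 is a common pix2pix default, assumed here) is what keeps the generated image pixel-wise faithful to the stained ground truth rather than merely plausible to the discriminator.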