| Literature DB >> 32101448 |
William Rogers1,2, Sithin Thulasi Seetha1,2, Turkey A G Refaee1,3, Relinde I Y Lieverse1, Renée W Y Granzier4,5, Abdalla Ibrahim1,4,6,7, Simon A Keek1, Sebastian Sanduleanu1, Sergey P Primakov1, Manon P L Beuque1, Damiënne Marcus1, Alexander M A van der Wiel1, Fadila Zerka1, Cary J G Oberije1, Janita E van Timmeren1,8,9, Henry C Woodruff1,4, Philippe Lambin1,4.
Abstract
Historically, medical imaging has been a qualitative or semi-quantitative modality: it is difficult to quantify what can be seen in an image and to turn it into valuable predictive outcomes. Owing to advances in both computational hardware and machine learning algorithms, computers are making great strides in extracting quantitative information from imaging and correlating it with outcomes. Radiomics, in its two forms, "handcrafted" and "deep," is an emerging field that translates medical images into quantitative data to yield biological information and enable radiological phenotypic profiling for diagnosis, theragnosis, decision support, and monitoring. Handcrafted radiomics is a multistage process in which features based on shape, pixel intensities, and texture are extracted from radiographs. Within this review, we describe these steps: starting from quantitative imaging data, how features can be extracted and correlated with clinical and biological outcomes, yielding models that can be used to make predictions, such as survival, or for detection and classification in diagnostics. The application of deep learning, the second arm of radiomics, and its place in the radiomics workflow are discussed, along with its advantages and disadvantages. To better illustrate the technologies being used, we provide real-world clinical applications of radiomics in oncology, showcasing current research as well as covering its limitations and future directions.
Entities:
Mesh:
Year: 2020 PMID: 32101448 PMCID: PMC7362913 DOI: 10.1259/bjr.20190948
Source DB: PubMed Journal: Br J Radiol ISSN: 0007-1285 Impact factor: 3.039
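The abstract describes handcrafted radiomics as a multistage process extracting shape, intensity, and texture features from a region of interest. As a minimal illustrative sketch (not the IBSI-standardized or PyRadiomics implementation described by the paper; function and feature names here are hypothetical), a few first-order intensity features plus a simple shape feature could be computed from a masked image like this:

```python
import numpy as np

def first_order_features(image, mask):
    """Compute a handful of first-order radiomic features inside an ROI.

    Hypothetical minimal example: real pipelines (e.g. PyRadiomics) add
    resampling, discretization settings, and many more feature classes.
    """
    voxels = image[mask > 0].astype(float)  # intensities inside the ROI

    # Histogram-based entropy over a fixed number of intensity bins
    counts, _ = np.histogram(voxels, bins=32)
    p = counts / counts.sum()
    p = p[p > 0]
    entropy = float(-np.sum(p * np.log2(p)))

    return {
        "mean": float(voxels.mean()),
        "std": float(voxels.std()),
        "min": float(voxels.min()),
        "max": float(voxels.max()),
        "energy": float(np.sum(voxels ** 2)),  # sum of squared intensities
        "entropy": entropy,
        "roi_area": int((mask > 0).sum()),     # simple shape surrogate
    }
```

In a full workflow, such feature vectors would then be correlated with clinical or biological outcomes (e.g. survival) via statistical or machine learning models, which is the modeling stage the review goes on to discuss.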