Sara Moccia1,2, Leonardo S Mattos3, Ilaria Patrini4, Michela Ruperti4, Nicolas Poté5,6, Federica Dondero7, François Cauchy7, Ailton Sepulveda7, Olivier Soubrane7, Elena De Momi4, Alberto Diaspro8, Manuela Cesaretti7,8. 1. Department of Advanced Robotics (ADVR), Istituto Italiano di Tecnologia, Via Morego 30, 16136, Genoa, GE, Italy. sara.moccia@iit.it. 2. Department of Electronics, Information and Bioengineering (DEIB), Politecnico di Milano, Piazza Leonardo da Vinci, 32, 20133, Milan, MI, Italy. sara.moccia@iit.it. 3. Department of Advanced Robotics (ADVR), Istituto Italiano di Tecnologia, Via Morego 30, 16136, Genoa, GE, Italy. 4. Department of Electronics, Information and Bioengineering (DEIB), Politecnico di Milano, Piazza Leonardo da Vinci, 32, 20133, Milan, MI, Italy. 5. Department of Pathology, Hôpital Beaujon, DHU UNITY, AP-HP, Clichy, France. 6. INSERM UMR1149, Paris, France. 7. Department of HPB Surgery and Liver Transplantation, Hôpital Beaujon, AP-HP, Clichy, France. 8. Department of Nanophysics, Istituto Italiano di Tecnologia, Via Morego 30, 16136, Genoa, GE, Italy.
Abstract
PURPOSE: Fast and accurate graft hepatic steatosis (HS) assessment is of primary importance for lowering liver dysfunction risks after transplantation. Histopathological analysis of biopsied liver is the gold standard for assessing HS, despite being invasive and time consuming. Due to the short time available between liver procurement and transplantation, surgeons perform HS assessment through clinical evaluation (medical history, blood tests) and visual analysis of liver texture. Although visual analysis is recognized as challenging in the clinical literature, few efforts have been invested in developing computer-assisted solutions for HS assessment. The objective of this paper is to investigate the automatic analysis of liver texture with machine learning algorithms to automate the HS assessment process and support the surgeon's decision process.
METHODS: Forty RGB images of forty different donors were analyzed. The images were captured with an RGB smartphone camera in the operating room (OR). Twenty images refer to accepted livers and twenty to discarded livers. Fifteen randomly selected liver patches were extracted from each image. Patch size was [Formula: see text]. This way, a balanced dataset of 600 patches was obtained. Intensity-based features (INT), histogram of local binary pattern ([Formula: see text]), and gray-level co-occurrence matrix ([Formula: see text]) were investigated. Blood-sample features (Blo) were included in the analysis, too. Supervised and semisupervised learning approaches were investigated for feature classification. Leave-one-patient-out cross-validation was performed to estimate the classification performance.
RESULTS: With the best-performing feature set ([Formula: see text]) and semisupervised learning, the achieved classification sensitivity, specificity, and accuracy were 95%, 81%, and 88%, respectively.
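The texture descriptors named in the METHODS section (intensity statistics, a local-binary-pattern histogram, and gray-level co-occurrence matrix features) can be sketched from first principles. This is a minimal illustration in plain NumPy, not the authors' implementation: the exact LBP variant, GLCM offsets, and quantization levels used in the paper are hidden behind the [Formula: see text] placeholders, so the choices below (8-neighbour non-rotation-invariant LBP, single one-pixel-right GLCM offset, 8 gray levels) are assumptions.

```python
import numpy as np

def intensity_features(patch):
    """INT-style descriptors: simple first-order statistics of a grayscale patch."""
    return np.array([patch.mean(), patch.std(), np.median(patch),
                     patch.min(), patch.max()], dtype=float)

def lbp_histogram(patch):
    """Normalized histogram of a basic 8-neighbour local binary pattern code."""
    p = patch.astype(int)
    H, W = p.shape
    center = p[1:-1, 1:-1]
    codes = np.zeros_like(center)
    # Each neighbour contributes one bit: 1 if it is >= the center pixel.
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(shifts):
        neigh = p[1 + dy:H - 1 + dy, 1 + dx:W - 1 + dx]
        codes += (neigh >= center) << bit
    hist, _ = np.histogram(codes, bins=256, range=(0, 256))
    return hist / hist.sum()

def glcm_features(patch, levels=8):
    """Contrast and energy from a gray-level co-occurrence matrix (offset: one pixel right)."""
    q = (patch.astype(float) / 256 * levels).astype(int).clip(0, levels - 1)
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1
    glcm /= glcm.sum()
    i, j = np.indices(glcm.shape)
    contrast = ((i - j) ** 2 * glcm).sum()
    energy = (glcm ** 2).sum()
    return np.array([contrast, energy])
```

Concatenating the three vectors gives one feature vector per patch, which is the input the classifiers described in the abstract would receive.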
CONCLUSIONS: This research represents the first attempt to use machine learning and automatic texture analysis of RGB images from ubiquitous smartphone cameras for the task of graft HS assessment. The results suggest that this is a promising strategy for developing a fully automatic solution to assist surgeons in HS assessment inside the OR.
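The leave-one-patient-out protocol from the abstract (all 15 patches of one donor held out per fold, training on the remaining 39 donors) can be sketched as below. The classifier here is a deliberately simple nearest-centroid stand-in, not the supervised/semisupervised methods the paper evaluates; the donor grouping and the sensitivity/specificity/accuracy bookkeeping are the point of the example.

```python
import numpy as np

def nearest_centroid_predict(X_train, y_train, X_test):
    """Tiny stand-in classifier: assign each test patch to the nearest class centroid."""
    classes = np.unique(y_train)
    centroids = np.stack([X_train[y_train == c].mean(axis=0) for c in classes])
    dists = np.stack([np.linalg.norm(X_test - c, axis=1) for c in centroids])
    return classes[dists.argmin(axis=0)]

def leave_one_patient_out(X, y, donors):
    """Hold out all patches of one donor per fold; train on the other donors."""
    y_true, y_pred = [], []
    for d in np.unique(donors):
        test = donors == d
        y_pred.extend(nearest_centroid_predict(X[~test], y[~test], X[test]))
        y_true.extend(y[test])
    y_true, y_pred = np.array(y_true), np.array(y_pred)
    tp = ((y_true == 1) & (y_pred == 1)).sum()
    tn = ((y_true == 0) & (y_pred == 0)).sum()
    sensitivity = tp / (y_true == 1).sum()
    specificity = tn / (y_true == 0).sum()
    accuracy = (y_true == y_pred).mean()
    return sensitivity, specificity, accuracy
```

Grouping folds by donor rather than by patch prevents patches from the same liver appearing in both training and test sets, which would otherwise inflate the reported performance.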
Keywords:
Liver; Machine learning; Surgical data science; Texture analysis; Transplantation