Julia Andresen1, Timo Kepp2, Jan Ehrhardt2,3, Claus von der Burchard4, Johann Roider4, Heinz Handels2,3. 1. Institute of Medical Informatics, University of Lübeck, Ratzeburger Allee 160, 23562, Lübeck, Germany. j.andresen@uni-luebeck.de. 2. Institute of Medical Informatics, University of Lübeck, Ratzeburger Allee 160, 23562, Lübeck, Germany. 3. German Research Center for Artificial Intelligence, Lübeck, Germany. 4. Department of Ophthalmology, Christian-Albrechts-University of Kiel, Kiel, Germany.
Abstract
PURPOSE: The registration of medical images often suffers from missing correspondences caused by inter-patient variations, pathologies, and their progression. These missing correspondences lead to implausible deformations that cause misregistrations and may eliminate valuable information. Detecting non-corresponding regions simultaneously with the registration process helps to generate better deformations and has been investigated thoroughly with classical iterative frameworks, but rarely with deep learning-based methods.
METHODS: We present the joint non-correspondence segmentation and image registration network (NCR-Net), a convolutional neural network (CNN) trained on a Mumford-Shah-like functional, transferring the classical approach to the field of deep learning. NCR-Net consists of one encoding and two decoding parts, allowing the network to simultaneously generate diffeomorphic deformations and segment non-correspondences. The loss function is composed of a masked image distance measure and regularization of the deformation field and the segmentation output. Additionally, anatomical labels are used for weak supervision of the registration task. No manual segmentations of non-correspondences are required.
RESULTS: The proposed network is evaluated on the publicly available LPBA40 dataset with artificially added stroke lesions and on a longitudinal optical coherence tomography (OCT) dataset of patients with age-related macular degeneration. The LPBA40 data are used to quantitatively assess the segmentation performance of the network, and it is shown qualitatively that NCR-Net can be used for the unsupervised segmentation of pathologies in OCT images. Furthermore, NCR-Net is compared to a registration-only network and to state-of-the-art registration algorithms, showing that NCR-Net achieves competitive performance and superior robustness to non-correspondences.
CONCLUSION: NCR-Net, a CNN for simultaneous image registration and unsupervised non-correspondence segmentation, is presented. Experimental results show the network's ability to segment non-correspondence regions in an unsupervised manner and its robust registration performance even in the presence of large pathologies.
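The loss described in METHODS combines a masked image distance with regularizers on the deformation and on the segmentation. The following is a minimal NumPy sketch of such a Mumford-Shah-like objective, not the paper's actual implementation: the function name `ncr_loss`, the choice of SSD as the image distance, and the weights `alpha` and `beta` are illustrative assumptions. The segmentation mask excludes non-corresponding voxels from the data term, while the area penalty keeps the mask from trivially covering the whole image.

```python
import numpy as np

def ncr_loss(fixed, warped, seg, grad_norm2, alpha=1.0, beta=0.1):
    """Sketch of a Mumford-Shah-like registration loss (illustrative only).

    fixed, warped : image arrays of identical shape
    seg           : soft non-correspondence mask in [0, 1]; 1 marks voxels
                    excluded from the image distance term
    grad_norm2    : precomputed squared gradient magnitude of the
                    deformation field (the smoothness regularizer)
    alpha, beta   : assumed weights for smoothness and mask area
    """
    # Masked sum-of-squared-differences: non-corresponding voxels do not
    # contribute to the image distance.
    data_term = np.mean((1.0 - seg) * (fixed - warped) ** 2)
    # Penalize rough deformations.
    smooth_term = alpha * np.mean(grad_norm2)
    # Penalize the segmented area so the mask stays small.
    area_term = beta * np.mean(seg)
    return data_term + smooth_term + area_term
```

In a network such as NCR-Net, the mask and the deformation would both be decoder outputs, so minimizing this loss trades off explaining intensity differences by deformation against declaring them non-corresponding.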