Inês Machado1,2, Matthew Toews3, Jie Luo4,5, Prashin Unadkat6, Walid Essayed6, Elizabeth George4, Pedro Teodoro7, Herculano Carvalho8, Jorge Martins7, Polina Golland9, Steve Pieper4,10, Sarah Frisken4, Alexandra Golby6, William Wells4,9. 1. Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, 75 Francis St., Boston, MA, 02115, USA. ines7.prata.machado@gmail.com. 2. IDMEC, Instituto Superior Técnico, Universidade de Lisboa, Av. Rovisco Pais 1, 1049-001, Lisbon, Portugal. ines7.prata.machado@gmail.com. 3. École de Technologie Superieure, 1100 Notre-Dame St W, Montreal, QC, H3C 1K3, Canada. 4. Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, 75 Francis St., Boston, MA, 02115, USA. 5. Graduate School of Frontier Sciences, University of Tokyo, 5-1-5 Kashiwanoha, Kashiwa, Chiba, Japan. 6. Department of Neurosurgery, Brigham and Women's Hospital, Harvard Medical School, 75 Francis St., Boston, MA, 02115, USA. 7. IDMEC, Instituto Superior Técnico, Universidade de Lisboa, Av. Rovisco Pais 1, 1049-001, Lisbon, Portugal. 8. Department of Neurosurgery, CHLN, Hospital de Santa Maria, Avenida Professor Egas Moniz, 1649-035, Lisbon, Portugal. 9. Computer Science and Artificial Intelligence Laboratory, Massachusetts Institute of Technology, 32 Vassar St, Cambridge, MA, 02139, USA. 10. Isomics, Inc., 55 Kirkland St, Cambridge, MA, 02138, USA.
Abstract
PURPOSE: The brain undergoes significant structural change over the course of neurosurgery, including highly nonlinear deformation and resection. It can be informative to recover the spatial mapping between structures identified in preoperative surgical planning and the intraoperative state of the brain. We present a novel feature-based method for robust, fully automatic deformable registration of intraoperative neurosurgical ultrasound images. METHODS: A sparse set of local image feature correspondences is first estimated between ultrasound image pairs, after which rigid, affine, and thin-plate spline models are used to estimate dense mappings throughout the image. Correspondences are derived from 3D features: distinctive, generic image patterns that are automatically extracted from 3D ultrasound images and characterized in terms of their geometry (i.e., location, scale, and orientation) and a descriptor of local image appearance. Feature correspondences between ultrasound images are established via nearest-neighbor descriptor matching and a probabilistic voting model similar to the Hough transform. RESULTS: We demonstrate our method on intraoperative ultrasound images acquired before and after opening of the dura mater, during resection, and after resection in nine clinical cases. A total of 1620 automatically extracted 3D feature correspondences were manually validated by eleven experts and used to guide the registration. Using manually labeled corresponding landmarks in the pre- and post-resection ultrasound images, we show that our feature-based registration reduces the mean target registration error from an initial value of 3.3 mm to 1.5 mm. CONCLUSIONS: These results demonstrate that 3D features promise to offer a robust and accurate solution for 3D ultrasound registration and for correcting brain shift in image-guided neurosurgery.
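The nearest-neighbor descriptor matching step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it pairs each descriptor in one image with its nearest neighbor in the other and keeps the pair only when the nearest distance is clearly smaller than the second-nearest (a ratio test), a common stand-in for the distinctiveness criterion enforced by the paper's probabilistic voting model. The function name and the 0.8 threshold are illustrative assumptions.

```python
import numpy as np

def match_features(desc_a, desc_b, ratio=0.8):
    """Nearest-neighbor descriptor matching with a ratio test.

    desc_a, desc_b: (N, D) and (M, D) arrays of appearance descriptors.
    Returns a list of (index_a, index_b) candidate correspondences.
    """
    # Pairwise Euclidean distances between the two descriptor sets
    d = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=2)
    matches = []
    for i in range(d.shape[0]):
        order = np.argsort(d[i])
        nearest, second = order[0], order[1]
        # Keep the match only if it is markedly better than the runner-up
        if d[i, nearest] < ratio * d[i, second]:
            matches.append((i, nearest))
    return matches

# Tiny synthetic example: two distinctive descriptors, one distractor
desc_a = np.array([[0.0, 0.0], [10.0, 10.0]])
desc_b = np.array([[0.1, 0.0], [10.0, 10.1], [50.0, 50.0]])
print(match_features(desc_a, desc_b))
```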
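Once sparse correspondences are validated, the dense thin-plate spline mapping and the target registration error (TRE) evaluation can be sketched as below. This is a hedged illustration under simplifying assumptions (a purely translational synthetic "brain shift" and scipy's `RBFInterpolator` with the `thin_plate_spline` kernel as the TPS solver), not the authors' pipeline.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)

# Synthetic validated feature correspondences: src (pre-resection) -> dst
# (post-resection), here related by a known translation in mm.
src = rng.uniform(0.0, 100.0, size=(30, 3))
shift = np.array([3.0, -1.0, 2.0])
dst = src + shift

# Fit a thin-plate spline mapping from the sparse correspondences;
# the fitted model can then be evaluated at any 3D point.
tps = RBFInterpolator(src, dst, kernel='thin_plate_spline')

# Held-out landmarks play the role of the manually labeled landmark
# pairs used to measure TRE before and after registration.
landmarks_pre = rng.uniform(10.0, 90.0, size=(5, 3))
landmarks_post = landmarks_pre + shift

tre_before = np.linalg.norm(landmarks_post - landmarks_pre, axis=1).mean()
tre_after = np.linalg.norm(landmarks_post - tps(landmarks_pre), axis=1).mean()
print(f"mean TRE before: {tre_before:.2f} mm, after: {tre_after:.2f} mm")
```

Because a translation lies in the affine (polynomial) part of the TPS model, the fitted mapping recovers it essentially exactly, so the post-registration TRE drops to near zero in this toy setting.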