PURPOSE: With recent substantial improvements in modern computing, interest in quantitative imaging with CT has increased dramatically. As a result, so has the need to create and analyze large, high-quality datasets of clinical studies. At present, no efficient, widely available method exists to accomplish this. This technical note describes an open-source, high-throughput computational pipeline framework for the reconstruction and analysis of diagnostic CT imaging data, intended to enable large-scale quantitative imaging studies and to accelerate and improve quantitative imaging research.

METHODS: The pipeline consists of two primary "blocks": reconstruction and analysis. Reconstruction is carried out via a graphics processing unit (GPU) queuing framework, developed specifically for the pipeline, that allows a dataset to be reconstructed under a variety of parameter configurations such as slice thickness, reconstruction kernel, and simulated acquisition dose. The analysis block then automatically processes the reconstruction output using "modules" that can be combined in various ways to conduct different experiments; analysis is accelerated using cluster processing. The pipeline's efficiency and performance are demonstrated on an example 142-subject lung screening cohort reconstructed 36 different ways and analyzed using quantitative emphysema scoring techniques.

RESULTS: The pipeline reconstructed and analyzed the 5112 resulting datasets in approximately 10 days, a roughly 72× speedup over previous efforts that used the scanner for reconstructions. Tightly coupled quality assurance software ensured proper performance of the analysis modules with regard to segmentation and emphysema scoring.

CONCLUSIONS: The pipeline greatly reduced the time from experiment conception to quantitative results. The modular design allows the high-throughput framework to be reused for future experiments on other quantitative imaging techniques. Applications currently being explored include robustness testing of quantitative imaging metrics, data generation for deep learning, and use as a test platform for image-processing techniques to improve clinical quantitative imaging.
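The two-block architecture described above (a reconstruction queue sweeping a parameter space, feeding composable analysis modules) can be sketched as follows. This is a minimal illustration only: the subject names, the module functions, and the thread pool standing in for the GPU queue and cluster are all hypothetical, not the published implementation.

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import product

# Hypothetical parameter sweep matching the abstract's example:
# 3 slice thicknesses x 3 kernels x 4 simulated dose levels = 36 configurations.
SLICE_THICKNESSES_MM = [0.6, 1.0, 2.0]
KERNELS = ["smooth", "medium", "sharp"]
DOSE_LEVELS = ["100%", "25%", "10%", "3%"]

def reconstruct(subject, thickness, kernel, dose):
    """Placeholder for one GPU reconstruction job; returns a dataset label."""
    return f"{subject}_{thickness}mm_{kernel}_{dose}"

def run_modules(dataset, modules):
    """Chain analysis 'modules' so they can be recombined per experiment."""
    state = dataset
    for module in modules:
        state = module(state)
    return state

def segment_lungs(dataset):            # hypothetical analysis module
    return {"dataset": dataset, "lung_mask": None}

def score_emphysema(state):            # hypothetical analysis module
    state["emphysema_score"] = 0.0     # e.g., a density-mask style metric
    return state

subjects = [f"subj{i:03d}" for i in range(142)]
configs = list(product(SLICE_THICKNESSES_MM, KERNELS, DOSE_LEVELS))

# The GPU queue is modeled here with a thread pool; the real framework
# dispatches jobs to available GPUs and fans results out to a cluster.
with ThreadPoolExecutor(max_workers=4) as pool:
    jobs = [pool.submit(reconstruct, s, t, k, d)
            for s in subjects for (t, k, d) in configs]
    datasets = [job.result() for job in jobs]

results = [run_modules(ds, [segment_lungs, score_emphysema])
           for ds in datasets]
print(len(datasets))  # 142 subjects x 36 configurations = 5112 reconstructions
```

The key design point the sketch illustrates is the decoupling: the reconstruction sweep knows nothing about the analysis, and the module list can be reordered or swapped to define a new experiment without touching the queuing code.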