Hamzeh Ghasemzadeh¹, Dimitar D. Deliyski², Robert E. Hillman³, Daryush D. Mehta³. 1. Department of Communicative Sciences and Disorders and Department of Computational Mathematics, Science and Engineering, Michigan State University, East Lansing, MI, USA. 2. Department of Communicative Sciences and Disorders, Michigan State University, East Lansing, MI, USA. 3. MGH Institute of Health Professions; Center for Laryngeal Surgery and Voice Rehabilitation, Massachusetts General Hospital; Department of Surgery, Harvard Medical School; and Speech and Hearing Bioscience and Technology, Division of Medical Sciences, Harvard Medical School, Boston, MA, USA.
Abstract
OBJECTIVE: Calibrated horizontal measurements (e.g., in mm) from endoscopic procedures could be utilized to advance evidence-based practice and personalized medicine. However, the size of an object in endoscopic images is not readily calibrated and depends on multiple factors, including the distance between the endoscope and the target surface. Additionally, acquired images may exhibit significant nonlinear distortion that further complicates calibrated measurement. This study used a recently developed in-vivo laser-projection fiberoptic laryngoscope and proposes a method for calibrated spatial measurements. METHOD: A set of circular grids was recorded at multiple working distances. A statistical model was trained to map the pixel length of an object, the working distance, and the object's spatial location in the image to its length in millimeters. RESULTS: A detailed analysis of the performance of the proposed method is presented. The analyses show that the accuracy of the proposed method does not depend on the working distance or the length of the target object. The estimated average magnitude of error was 0.27 mm, three times lower than that of the existing alternative. CONCLUSION: The presented method can achieve sub-millimeter accuracy in horizontal measurement. SIGNIFICANCE: Evidence-based practice and personalized medicine could significantly benefit from the proposed method. Implications of the findings for other endoscopic procedures are also discussed.
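The calibration step described in the METHOD can be illustrated with a minimal sketch. This is not the authors' actual model; it assumes a simplified pinhole-camera relation (physical length roughly proportional to pixel length times working distance, with a radial term standing in for lens distortion) and fits a least-squares mapping on synthetic calibration-grid data. All variable names and the functional form are illustrative assumptions.

```python
import numpy as np

# Synthetic calibration data: circular grids imaged at multiple
# working distances, with measured pixel lengths and known mm lengths.
rng = np.random.default_rng(0)
n = 200
pixel_len = rng.uniform(20, 200, n)   # object length in pixels
work_dist = rng.uniform(5, 40, n)     # endoscope-to-surface distance (mm)
radial_pos = rng.uniform(0, 1, n)     # normalized distance from image center

# Assumed ground-truth relation: scale grows with working distance,
# plus a mild radial (distortion-like) correction term.
mm_len = 0.004 * pixel_len * work_dist * (1 + 0.1 * radial_pos**2)

# Design matrix with the interaction features the mapping needs:
# (pixel length x distance) and its radially weighted counterpart.
X = np.column_stack([
    pixel_len * work_dist,
    pixel_len * work_dist * radial_pos**2,
])
coef, *_ = np.linalg.lstsq(X, mm_len, rcond=None)

# Evaluate the fitted mapping on the training grid.
pred = X @ coef
mae = np.mean(np.abs(pred - mm_len))
print(f"mean absolute error: {mae:.4f} mm")
```

In practice the model would be trained on grids recorded at known working distances and validated on held-out distances; the sketch only shows the feature-to-mm regression structure.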