Aseem Anand1, Michael J Morris2,3, Reza Kaboteh4, Mariana Reza5, Elin Trägårdh5, Naofumi Matsunaga6, Lars Edenbrandt4,5, Anders Bjartell7,8, Steven M Larson3,9, David Minarik10. 1. Division of Urological Cancers, Department of Translational Medicine, Malmö, Lund University, Lund, Sweden aseem.anand@med.lu.se. 2. Department of Medicine, Memorial Sloan Kettering Cancer Center, New York, New York. 3. Weill Cornell Medical College, New York, New York. 4. Department of Clinical Physiology, Sahlgrenska University Hospital, Gothenburg, Sweden. 5. Department of Clinical Physiology, Skåne University Hospital, Lund University, Malmö, Sweden. 6. Department of Radiology, Yamaguchi University Hospital, Yamaguchi, Japan. 7. Division of Urological Cancers, Department of Translational Medicine, Malmö, Lund University, Lund, Sweden. 8. Department of Urology, Lund University, Malmö, Sweden. 9. Department of Radiology, Memorial Sloan Kettering Cancer Center, New York, New York. 10. Department of Radiation Physics, Skåne University Hospital, Malmö, Sweden.
Abstract
The effect of procedural variability in image acquisition on the quantitative assessment of bone scans is unknown. Here, we developed and performed preanalytical studies to assess the impact of variability in scanning speed and in vendor-specific γ-cameras on the reproducibility and accuracy of the automated bone scan index (BSI). METHODS: Two separate preanalytical studies were performed: a patient study and a simulation study. In the patient study, to evaluate the effect on BSI reproducibility, repeated bone scans were prospectively obtained from metastatic prostate cancer patients enrolled in 3 groups (Grp). In Grp1, the repeated scan used the same scanning speed and γ-camera vendor as the original scan. In Grp2, the repeated scan was acquired at twice the speed of the original scan. In Grp3, the repeated scan used a different γ-camera vendor than the original scan. In the simulation study, to evaluate the effect on BSI accuracy, bone scans of a virtual phantom with a predefined skeletal tumor burden (phantom-BSI) were simulated across a range of image counts (0.2, 0.5, 1.0, and 1.5 million) and, separately, across the resolution settings of the vendor-specific γ-cameras. The automated BSI was measured with a computer-automated platform. Reproducibility was measured as the absolute difference between the repeated BSI values, and accuracy was measured as the absolute difference between the observed BSI and the phantom-BSI values. Descriptive statistics were used to compare the generated data. RESULTS: In the patient study, 75 patients, 25 in each group, were enrolled. The reproducibility of Grp2 (mean ± SD, 0.35 ± 0.59) was significantly lower than that of Grp1 (mean ± SD, 0.10 ± 0.13; P < 0.0001) and that of Grp3 (mean ± SD, 0.09 ± 0.10; P < 0.0001). However, no significant difference was observed between the reproducibility of Grp3 and Grp1 (P = 0.388).
In the simulation study, the accuracy at 0.5 million counts (mean ± SD, 0.57 ± 0.38) and at 0.2 million counts (mean ± SD, 4.67 ± 0.85) was significantly lower than that at 1.5 million counts (mean ± SD, 0.20 ± 0.26; P < 0.0001). No significant difference in accuracy was observed across the vendor-specific γ-cameras (P = 0.266). CONCLUSION: In this study, automated BSI accuracy and reproducibility depended on scanning speed but not on the vendor-specific γ-camera. Prospective BSI studies should standardize the scanning speed of bone scans to obtain image counts at or above 1.5 million.
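The two endpoints reduce to simple paired statistics: reproducibility is the absolute difference between repeated BSI values, and accuracy is the absolute difference between observed and phantom BSI, each summarized as mean ± SD. A minimal sketch of that computation follows; all BSI values shown are hypothetical illustrations, not data from the study.

```python
# Sketch of the abstract's endpoints. Reproducibility: |BSI_repeat - BSI_original|
# per patient. Accuracy: |BSI_observed - phantom-BSI| per simulated scan.
# Every numeric value below is made up for illustration.

from statistics import mean, stdev

def abs_differences(a, b):
    """Pairwise |a_i - b_i| for two equal-length sequences of BSI values."""
    return [abs(x - y) for x, y in zip(a, b)]

def summarize(diffs):
    """Return (mean, SD), the summary reported in the abstract."""
    return mean(diffs), stdev(diffs)

# Reproducibility: original vs. repeated scan (hypothetical patients)
original = [1.20, 0.45, 3.10, 0.80]
repeated = [1.28, 0.40, 3.05, 0.95]
repro_m, repro_s = summarize(abs_differences(original, repeated))

# Accuracy: observed BSI vs. predefined phantom-BSI (hypothetical scans)
observed = [2.05, 2.40, 1.90]
phantom = [2.00, 2.00, 2.00]
acc_m, acc_s = summarize(abs_differences(observed, phantom))

print(f"reproducibility: {repro_m:.2f} ± {repro_s:.2f}")
print(f"accuracy: {acc_m:.2f} ± {acc_s:.2f}")
```

A larger mean absolute difference corresponds to worse reproducibility or accuracy, which is why the abstract describes Grp2 and the low-count simulations as "significantly lower."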