Chae Jung Park1, Yae Won Park2, Sung Soo Ahn2, Dain Kim3, Eui Hyun Kim4, Seok-Gu Kang4, Jong Hee Chang4, Se Hoon Kim5, Seung-Koo Lee6.
Abstract
OBJECTIVE: Our study aimed to evaluate the quality of radiomics studies on brain metastases based on the radiomics quality score (RQS), the Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD) checklist, and the Image Biomarker Standardization Initiative (IBSI) guidelines.
Keywords: Brain metastasis; Machine learning; Quality improvement; Radiomics; Radiomics quality score
Year: 2022 PMID: 34983096 PMCID: PMC8743155 DOI: 10.3348/kjr.2021.0421
Source DB: PubMed Journal: Korean J Radiol ISSN: 1229-6929 Impact factor: 3.500
Fig. 1. Flow chart of the study selection process.
Characteristics of the 29 Included Radiomics Studies
| Article Characteristics | Data |
|---|---|
| No. of patients, median (range) | 77 (24–439) |
| Journal type | |
| Clinical journal | 7 (24.1) |
| Imaging journal | 18 (62.1) |
| Computer science/neuroscience journal | 4 (13.8) |
| Biomarker | |
| Diagnostic | 21 (72.4) |
| Predictive | 0 (0.0) |
| Prognostic | 8 (27.6) |
| Topics in brain metastasis | |
| Differentiation of brain metastasis and other brain tumors | 8 (27.6) |
| Differentiation of brain metastasis and radiation necrosis after radiosurgery | 4 (13.8) |
| Prediction of primary tumor type in patients with brain metastasis | 4 (13.8) |
| Prediction of specific gene mutations in patients with brain metastasis | 5 (17.2) |
| Prediction of outcome of radiosurgery in patients with brain metastasis | 6 (20.7) |
| Prediction of survival in patients with brain metastasis | 2 (6.9) |
| Sequence used for feature extraction* | |
| Conventional images | 23 (82.1) |
| Advanced images (diffusion-weighted or diffusion tensor imaging) | 5 (17.9) |
| Segmentation method | |
| Automatic | 1 (3.4) |
| Semi-automatic | 9 (31.0) |
| Manual | 18 (62.1) |
| Both automatic and manual | 1 (3.4) |
| External validation | |
| Performed | 3 (10.3) |
| Not performed | 26 (89.7) |
| Magnetic field strength (Tesla)* | |
| 1.5T | 10 (35.7) |
| 3T | 11 (39.3) |
| 1.5T & 3T | 6 (21.4) |
| Not reported | 1 (3.6) |
Data are the number of studies, with percentages in parentheses, unless specified otherwise. *Applies to the 28 radiomics studies that used MRI; the remaining study used PET.
Radiomics Quality Score according to the Six Key Domains
| RQS Item | Basic Adherence Rate (%)* | Median (Range) | Percentage of the Ideal Score (%)† |
|---|---|---|---|
| Total 16 items (ideal score 36) | 50.0 | 3 (-6 to 12) | 9.6 (3.4) |
| Domain 1: protocol quality and stability in image and segmentation (0 to 5 points) | 69.0 (20) | 1 (0–2) | 20.0 (1.0) |
| Protocol quality (2 points) | 76.0 (19) | 1 (0–1) | 34.5 (0.7) |
| Multiple segmentations (1 point) | 16.0 (4) | 0 (0–1) | 31.0 (0.3) |
| Test-retest (1 point) | 0 (0) | 0 (0–0) | 0 (0) |
| Phantom study (1 point) | 0 (0) | 0 (0–0) | 0 (0) |
| Domain 2: feature selection and validation (-8 to 8 points) | 82.8 (24) | -2 (-8 to 6) | -4.3 (-0.3) |
| Feature reduction or adjustment of multiple testing (-3 or 3 points) | 82.8 (24) | 3 (-3 to 3) | 65.5 (2.0) |
| Validation (-5, 2, 3, 4, or 5 points) | 37.9 (11) | -5 (-5 to 3) | -46.2 (-2.3) |
| Domain 3: biologic/clinical validation and utility (0 to 6 points) | 55.2 (16) | 1 (0–4) | 16.7 (1.0) |
| Non-radiomics features (1 point) | 44.8 (13) | 0 (0–1) | 44.8 (0.4) |
| Biologic correlations (1 point) | 13.8 (4) | 0 (0–1) | 13.8 (0.1) |
| Comparison to “gold standard” (2 points) | 10.3 (3) | 0 (0–2) | 8.6 (0.2) |
| Potential clinical utility (2 points) | 10.3 (3) | 0 (0–2) | 10.3 (0.2) |
| Domain 4: model performance index (0 to 5 points) | 89.7 (26) | 2 (0–4) | 35.2 (1.8) |
| Cut-off analysis (1 point) | 3.4 (1) | 0 (0–1) | 3.4 (0.0) |
| Discrimination statistics (2 points) | 89.7 (26) | 2 (0–2) | 81.0 (1.6) |
| Calibration statistics (2 points) | 6.9 (2) | 0 (0–2) | 5.2 (0.1) |
| Domain 5: high level of evidence (0 to 8 points) | 0 (0) | 0 (0–0) | 0 (0) |
| Prospective study (7 points) | 0 (0) | 0 (0–0) | 0 (0) |
| Cost-effectiveness analysis (1 point) | 0 (0) | 0 (0–0) | 0 (0) |
| Domain 6: open science and data (0 to 4 points) | 3.4 (1) | 0 (0–1) | 0.9 (0.0) |
*Numbers in parentheses are the numbers of studies. †Numbers in parentheses are the mean values.
Fig. 2. RQS assessment results according to the six key domains.
RQS = radiomics quality score
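From the tabulated values, the percentage-of-ideal column appears to be the mean achieved score divided by the maximum attainable points for that item (e.g., Domain 1: mean 1.0 of 5 points → 20.0%). A minimal sketch under that assumption (the function name is ours, and recomputed values may differ slightly from the table because the tabulated means are rounded):

```python
def pct_of_ideal(mean_score: float, max_points: float) -> float:
    """Mean achieved score as a percentage of the maximum attainable points."""
    return round(100 * mean_score / max_points, 1)

# Domain 1 from the table: mean score 1.0 out of an ideal 5 points
print(pct_of_ideal(1.0, 5))    # 20.0
# Total RQS: the 16 items sum to an ideal score of 36 points
print(pct_of_ideal(3.4, 36))   # 9.4 (the table reports 9.6 from the unrounded mean)
```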
Adherence of Radiomics Studies to Individual Transparent Reporting of a Multivariable Prediction Model for Individual Prognosis Or Diagnosis Checklist Items
| Section | Checklist Item (35 Items Total) | Data (%) |
|---|---|---|
| Title and abstract (n = 29) | | |
| Title | 1. Identify developing/validating a model, target population, and the outcome | 1 (3.4) |
| Abstract | 2. Provide a summary of objectives, study design, setting, participants, sample size, predictors, outcome, statistical analysis, results, and conclusions | 23 (79.3) |
| Introduction (n = 29) | | |
| Background and objectives | 3a. Explain the medical context and rationale for developing/validating the model | 26 (89.7) |
| | 3b. Specify the objectives, including whether the study describes the development/validation of the model or both | 24 (82.8) |
| Methods (n = 29 except for item 5c, for which n = 12) | | |
| Source of data | 4a. Describe the study design or source of data (randomized trial, cohort, or registry data) | 26 (89.7) |
| | 4b. Specify the key dates | 25 (86.2) |
| Participants | 5a. Specify key elements of the study setting, including number and location of centers | 4 (13.8) |
| | 5b. Describe eligibility criteria for participants (inclusion and exclusion criteria) | 22 (75.9) |
| | 5c. Give details of treatment received, if relevant (n = 12) | 9 of 12 (75.0) |
| Outcome | 6a. Clearly define the outcome, including how and when assessed | 20 (69.0) |
| | 6b. Report any actions to blind assessment of the outcome | 0 (0) |
| Predictors | 7a. Clearly define all predictors, including how and when assessed | 28 (96.6) |
| | 7b. Report any actions to blind assessment of predictors for the outcome and other predictors | 7 (24.1) |
| Sample size | 8. Explain how the study size was arrived at | 11 (37.9) |
| Missing data | 9. Describe how missing data were handled, with details of any imputation method | 0 (0) |
| Statistical analysis methods | 10a. Describe how predictors were handled | 29 (100) |
| | 10b. Specify type of model, all model-building procedures (any predictor selection), and method for internal validation | 27 (93.1) |
| | 10d. Specify all measures used to assess model performance and, if relevant, to compare multiple models (discrimination and calibration) | 26 (89.7) |
| Risk groups | 11. Provide details on how risk groups were created, if done | 3 (10.3) |
| Results (n = 29 except for item 14b, for which n = 20) | | |
| Participants | 13a. Describe the flow of participants, including the number of participants with and without the outcome; a diagram may be helpful | 10 (34.5) |
| | 13b. Describe the characteristics of the participants, including the number of participants with missing data for predictors and outcome | 21 (72.4) |
| Model development | 14a. Specify the number of participants and outcome events in each analysis | 26 (89.7) |
| | 14b. Report the unadjusted association between each candidate predictor and outcome, if done (n = 20) | 13 (44.8) |
| Model specification | 15a. Present the full prediction model to allow predictions for individuals (regression coefficients, intercept) | 0 (0) |
| | 15b. Explain how to use the prediction model (nomogram, calculator, etc.) | 3 (10.3) |
| Model performance | 16. Report performance measures (with confidence intervals) for the prediction model | 22 (75.9) |
| Discussion | | |
| Limitations | 18. Discuss any limitations of the study | 26 (89.7) |
| Interpretation | 19b. Give an overall interpretation of the results | 28 (96.6) |
| Implications | 20. Discuss the potential clinical use of the model and implications for future research | 24 (82.8) |
| For validation (types 2a, 2b, 3, and 4) (n = 7 except for items 10e and 17, for which n = 0) | | |
| Methods: statistical analysis methods | 10c. Describe how the predictions were calculated | 6 (85.7) |
| | 10e. Describe any model updating (recalibration), if done (n = 0) | 0 (0) |
| Methods | 12. Identify any differences from the development data in setting, eligibility criteria, outcome, and predictors | 4 (57.1) |
| Results | 13c. Show a comparison with the development data of the distribution of important variables | 4 (57.1) |
| Results: model updating | 17. Report the results from any model updating, if done (n = 0) | 0 (0) |
| Discussion: interpretation | 19a. Discuss the results with reference to performance in the development data and any other validation data | 2 (28.6) |
Data are the numbers of studies, with percentages in parentheses.
Quality of Image Processing and Radiomics Feature Extraction according to the Image Biomarker Standardization Initiative Guidelines
| Item | Number of Studies (%) |
|---|---|
| Pre-processing performed | |
| Bias-field correction | 6 (20.7)* |
| Isovoxel resampling | 9 (31.0) |
| Skull stripping | 4 (13.8) |
| Gray-level discretization | 4 (13.8) |
| Signal intensity normalization | 20 (69.0) |
| Software for feature extraction | |
| Pyradiomics | 9 (31.0) |
| Matlab | 8 (27.6) |
| LIFEx | 4 (13.8) |
| IBEX | 2 (6.9) |
| Others | 4 (13.8) |
| N/A | 2 (6.9) |
*One study performed N3 bias correction, and the remaining five studies performed N4 bias correction. N/A = not available
Fig. 3. Reporting of image pre-processing and radiomics feature extraction according to the Image Biomarker Standardization Initiative guidelines.
N/A = not available
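For orientation, the pre-processing steps tallied above map onto configuration choices in feature-extraction software such as pyradiomics, the most used tool among these studies. A hypothetical sketch of such a configuration follows; the key names mimic pyradiomics-style settings but are assumptions to verify against the tool's documentation, and bias-field correction and skull stripping are typically performed upstream of the extractor (e.g., with SimpleITK or FSL):

```python
# Illustrative pre-processing configuration mirroring the IBSI-relevant steps
# reported above. Setting names are pyradiomics-style assumptions, not a verified API.
config = {
    "normalize": True,                          # signal intensity normalization
    "resampledPixelSpacing": [1.0, 1.0, 1.0],   # isovoxel resampling (mm)
    "binWidth": 25,                             # gray-level discretization
}

# Steps usually handled earlier in the image pipeline, outside the extractor
upstream_steps = ["N4 bias-field correction", "skull stripping"]

print(sorted(config))
print(upstream_steps)
```

The split reflects a common design: intensity-domain settings live in the extractor configuration, while anatomical clean-up happens once per image before any feature extraction.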