Merel Huisman1, Erik Ranschaert2, William Parker3, Domenico Mastrodicasa4, Martin Koci5, Daniel Pinto de Santos6, Francesca Coppola7, Sergey Morozov8, Marc Zins9, Cedric Bohyn10, Ural Koç11, Jie Wu12, Satyam Veean13, Dominik Fleischmann4, Tim Leiner14, Martin J Willemink4.
Abstract
OBJECTIVES: Currently, hurdles to the implementation of artificial intelligence (AI) in radiology are a much-debated topic but have not been investigated in the community at large. Controversy also exists over whether, and to what extent, AI should be incorporated into radiology residency programs.
Keywords: Artificial intelligence; Diagnostic imaging; Radiology; Surveys and questionnaires
Year: 2021 PMID: 33974148 PMCID: PMC8111651 DOI: 10.1007/s00330-021-07782-4
Source DB: PubMed Journal: Eur Radiol ISSN: 0938-7994 Impact factor: 5.315
Baseline characteristics of all respondents (n = 1041)
| Characteristic | Subgroup | n (%) |
|---|---|---|
| Gender (male) | | 670 (65%)a |
| Age (median (range)) | | 38 (24–70) |
| Region | Africa | 14 (1%) |
| | Asia | 73 (7%) |
| | Australia | 8 (1%) |
| | Europe | 867 (83%) |
| | North America | 65 (6%) |
| | South America | 14 (1%) |
| Type of hospital | Academic | 471 (45%) |
| | Non-academic | 367 (35%) |
| | Private | 203 (20%) |
| Current position | Radiologist | 692 (66%) |
| | Fellow | 27 (3%) |
| | Resident | 322 (31%) |
| Subspecialization | Abdominal | 328 (32%) |
| | Musculoskeletal | 214 (23%) |
| | Neuro | 208 (20%) |
| | Interventional | 183 (18%) |
| | Breast | 115 (11%) |
| | Cardiothoracic | 179 (17%) |
| | Pediatric | 89 (9%) |
| | Molecular/nuclear | 41 (4%) |
| Advanced scientific backgroundb | No | 727 (70%) |
| | PhD | 148 (14%) |
| | Research fellowship | 51 (5%) |
| | PhD & research fellowship | 23 (2%) |
| | Obtaining PhD/research fellowship | 92 (9%) |
| Social media use (professional) | No | 477 (46%) |
| | Yes | 564 (54%) |
| | | 360 (64%) |
| | | 115 (20%) |
| | | 99 (18%) |
| | | 78 (14%) |
aPrefer not to say (n = 14)
bIn addition to medical school
Expectations and anticipated hurdles to implementation (n = 1041)
| Question / answer | n (%) |
|---|---|
| Can AI help improve diagnostic radiology? | |
| Yes | 108 (10%) |
| Maybe | 926 (89%) |
| No | 7 (1%) |
| How can AI help diagnostic radiology?a | |
| Second reader | 829 (78%) |
| Workflow optimization | 803 (77%) |
| Partial replacement | 493 (47%) |
| Full replacement | 11 (1%) |
| Workflow optimization only | 99 (10%) |
| Anticipated hurdles to implementationa | |
| Costs of development | 363 (35%) |
| Cost of software itself | 400 (38%) |
| Lack of: | |
| Trust of stakeholdersb | 376 (36%) |
| Knowledge of stakeholders | 584 (56%) |
| High-quality image data | 159 (15%) |
| High-quality image labels | 287 (28%) |
| Generalizability of the software | 410 (39%) |
| Ethical/legal issues | 630 (62%) |
| Limitations in digital infrastructure | 356 (35%) |
| Other | 14 (1%) |
aMultiple answers possible
bClinicians, staff, or management
Fig. 1 The relation between participant age and when respondents expect AI to alter the clinical radiology setting
Independent predictors for term of expected impact of AI in diagnostic radiology and anticipated hurdles to its implementation
| Outcome | Independent predictorsa | Adjusted OR (95% CI) |
|---|---|---|
| Term of expected impact | | |
| Short term (< 5 years) | Age (10-year interval) | 1.26 (1.07–1.47) |
| | Female | 1.37 (1.02–1.84) |
| | Heard of AI | 2.74 (1.14–6.57) |
| | Intermediate AI-specific knowledge | 4.30 (1.79–10.26) |
| | Advanced AI-specific knowledge | 5.31 (2.13–13.23) |
| | Abdominal radiologists | 0.69 (0.51–0.93) |
| Middle-long term (5–10 years) | Male | 1.51 (1.14–2.00) |
| | Europe | 1.69 (1.18–2.42) |
| Long term (> 10 years) | Age (10-year interval) | 0.64 (0.51–0.82) |
| | Europe | 0.54 (0.35–0.85) |
| | Social media use | 0.60 (0.41–0.87) |
| | Intermediate AI-specific knowledge | 0.23 (0.11–0.52) |
| | Advanced AI-specific knowledge | 0.17 (0.07–0.44) |
| Anticipated hurdles | | |
| Costs (software or development) | – | NS |
| Lack of stakeholder trust in AI | Europe | 0.56 (0.40–0.81) |
| | Cardiothoracic radiologists | 1.57 (1.11–2.22) |
| Lack of knowledge or expertise of stakeholders | Private centers | 0.63 (0.24–0.94) |
| Lack of high-quality image data | Europe | 0.39 (0.26–0.61) |
| | Private centers | 0.49 (0.27–0.89) |
| | Advanced AI-specific knowledge | 3.37 (1.05–10.84) |
| | Breast radiologists | 0.43 (0.20–0.90) |
| | Pediatric radiologists | 2.13 (1.20–3.80) |
| Lack of high-quality image labels | Advanced AI-specific knowledge | 5.42 (2.22–13.21) |
| Lack of generalizability (i.e., external validity) | Age (10-year interval) | 0.85 (0.73–0.99) |
| | Europe | 0.54 (0.38–0.77) |
| Ethical and legal issues | Europe | 0.59 (0.40–0.85) |
| | Basic AI-specific knowledge | 0.68 (0.48–0.96) |
| | Intermediate AI-specific knowledge | 2.90 (1.48–5.65) |
| | Advanced AI-specific knowledge | 2.85 (1.39–5.86) |
| | Musculoskeletal radiologists | 1.44 (1.03–2.01) |
| Limitations in digital infrastructure of the hospital/center | Non-academic centers | 0.58 (0.42–0.82) |
| | Private centers | 0.57 (0.37–0.87) |
| | Abdominal radiologists | 1.45 (1.08–1.95) |
| | Cardiothoracic radiologists | 1.51 (1.05–2.15) |
| | Interventional radiologists | 1.55 (1.09–2.21) |
aCorrected for age, gender, region (European versus non-European), type of hospital (academic, non-academic, private), scientific background, current position (resident versus radiologist), professional social media use, knowledge of informatics/statistics, AI-specific knowledge, and subspecialty
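The adjusted odds ratios above come from multivariable logistic regression. As a minimal sketch of how such figures are derived (the coefficient and standard error below are hypothetical, not taken from this study), an OR and its 95% CI are obtained by exponentiating the regression coefficient β and β ± 1.96·SE:

```python
import math

def odds_ratio_ci(beta: float, se: float, z: float = 1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a Wald-type 95% confidence interval."""
    odds_ratio = math.exp(beta)
    lower = math.exp(beta - z * se)   # lower bound of 95% CI
    upper = math.exp(beta + z * se)   # upper bound of 95% CI
    return odds_ratio, lower, upper

# Hypothetical values, for illustration only:
or_, lo, hi = odds_ratio_ci(beta=0.231, se=0.081)
print(f"OR {or_:.2f} ({lo:.2f}-{hi:.2f})")
```

A CI that excludes 1.00 corresponds to a statistically significant predictor at the 5% level, which is why only such predictors appear in the table while others are marked NS.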
Fig. 2 Anticipated hurdles as indicated by respondents, according to AI-specific knowledge level
Opinions and independent predictors for AI and imaging informatics in radiology curricula (n = 1041)
| Answer | Independent predictorsa | Adjusted OR (95% CI) |
|---|---|---|
| AI should be incorporated in residency programs | | |
| Yes | Age (10-year interval) | 1.43 (1.20–1.74) |
| | Residents | 1.71 (1.09–2.68) |
| | Heard of AI | 2.96 (1.48–5.89) |
| | Intermediate AI-specific knowledge | 3.84 (1.90–7.77) |
| | Advanced AI-specific knowledge | 5.16 (2.33–11.43) |
| | Pediatric radiologists | 0.58 (0.35–0.98) |
| AI/imaging informatics should be a subspecialty | | |
| Yes | Social media use | 1.38 (1.01–1.89) |
| Maybe | N/A | N/A |
| No | N/A | N/A |
aCorrected for age, gender, region (European versus non-European), type of hospital (academic, non-academic, private), scientific background, current position (resident versus radiologist), professional social media use, knowledge of informatics/statistics, AI-specific knowledge, and subspecialty
Independent predictors for self-learning methods pertaining to artificial intelligence in radiology
| Self-learning method | Independent predictorsa | Adjusted OR (95% CI) |
|---|---|---|
| Scientific literature | Age (10-year interval) | 0.85 (0.72–1.00) |
| | Male | 1.56 (1.16–2.09) |
| | Social media use | 1.54 (1.17–2.02) |
| | Knowledge of statistics/informatics | 1.52 (1.15–2.04) |
| | Intermediate AI-specific knowledge | 2.54 (1.27–5.07) |
| | Advanced AI-specific knowledge | 5.25 (2.41–11.44) |
| Conferences or specialty courses | Age (10-year interval) | 0.83 (0.70–0.98) |
| | Male | 0.71 (0.51–0.98) |
| | Social media use | 1.47 (1.09–1.98) |
| | Basic AI-specific knowledge | 0.65 (0.44–0.96) |
| | Intermediate AI-specific knowledge | 2.86 (1.42–5.75) |
| | Advanced AI-specific knowledge | 3.87 (1.75–8.57) |
| Online articles (non-scientific) | Social media use | 1.63 (1.26–2.13) |
| | Scientific background | 0.66 (0.49–0.90) |
| | Advanced AI-specific knowledge | 2.86 (1.38–5.91) |
| E-learning platforms | Social media use | 1.71 (1.31–2.24) |
| | Knowledge of statistics/informatics | 1.37 (1.04–1.82) |
| | Basic AI-specific knowledge | 0.66 (0.47–0.92) |
| | Advanced AI-specific knowledge | 2.42 (1.14–5.12) |
| | Breast radiologists | 0.49 (0.31–0.77) |
| Social media | Age (10-year interval) | 0.65 (0.52–0.80) |
| | Europe | 0.43 (0.29–0.67) |
| | Scientific background | 0.60 (0.41–0.87) |
| | Social media use | 8.50 (5.60–12.9) |
| | Abdominal radiologists | 0.63 (0.43–0.91) |
aCorrected for age, gender, region (European versus non-European), type of hospital (academic, non-academic, private), scientific background, current position (resident versus radiologist), professional social media use, knowledge of informatics/statistics, AI-specific knowledge, and subspecialty