Allard W Olthof1,2, Peter M A van Ooijen3,4, Mohammad H Rezazade Mehrizi5.
Abstract
PURPOSE: To conduct a systematic review of the possibilities of artificial intelligence (AI) in neuroradiology by performing an objective, systematic assessment of available applications. To analyse the potential impacts of AI applications on the work of neuroradiologists.
Keywords: Artificial intelligence (AI); Machine learning; Neurology/diagnostic imaging; Organizational innovation; Radiology; Technography
Year: 2020 PMID: 32318774 PMCID: PMC7479016 DOI: 10.1007/s00234-020-02424-w
Source DB: PubMed Journal: Neuroradiology ISSN: 0028-3940 Impact factor: 2.804
Fig. 1 Research flow chart. RSNA = Radiological Society of North America. ECR = European Congress of Radiology. ESR AIX = European Society of Radiology Artificial Intelligence Exhibition. ASNR = American Society of Neuroradiology. SIIM = Society for Imaging Informatics in Medicine. Neuroradiology is defined as applications related to the brain or spinal cord as anatomical areas or related to specific diseases of the brain or spinal cord. (*) company websites, LinkedIn.com, Crunchbase.com, and the FDA website
Codebook for classification of impact
| Category | Definition | Inclusion/exclusion criteria and examples |
|---|---|---|
| Supporting | The functionality helps with an inefficient task but does not fundamentally change the primary/current workflow; the involvement of human actors is still required, and the process of the task/workflow remains the same. | (1) does not change the fundamental task; (2) makes the process more efficient compared to prior activities; (3) still requires human involvement; (4) the system only helps humans to do their job. Example: visualization of the images and information |
| Replacing | The functionality performs a certain task that was previously conducted by a human actor; the human actor is now (almost) no longer involved. | (1) changes the particular fundamental task; (2) does not require human involvement; (3) technology takes over the task; (4) the task was previously conducted by human actors. Example: autonomous reading and reporting of radiology cases |
| Extending | Technology offers a functionality that was not previously performed by human actors or by previous systems; with this new functionality, a new task becomes possible. | (1) creates a new task; (2) does require human involvement but solves the problem through an algorithm; (3) improves/extends the existing task; (4) the tasks were previously non-existent. Example: provides diagnostic information that was not available before, such as a heatmap of suspicious areas |
Fig. 7 Database structure. For data storage, a relational database was developed in Microsoft Access. For analysis, information from different tables can be combined in queries and exported for visualization
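The analysis step described above, combining different tables in queries, can be illustrated with a minimal sketch. The study used Microsoft Access; the snippet below uses SQLite instead, and the table and column names (`company`, `application`, `functionality`) are illustrative assumptions, not the authors' actual schema.

```python
import sqlite3

# In-memory relational database mirroring the kind of structure described:
# companies, their applications, and the functionalities of each application.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE company       (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE application   (id INTEGER PRIMARY KEY, name TEXT,
                            company_id INTEGER REFERENCES company(id));
CREATE TABLE functionality (application_id INTEGER REFERENCES application(id),
                            label TEXT);
""")
con.execute("INSERT INTO company VALUES (1, 'ExampleVendor')")
con.execute("INSERT INTO application VALUES (1, 'ExampleApp', 1)")
con.execute("INSERT INTO functionality VALUES (1, 'Detection')")

# Combine the tables in a single query, as described for the analysis step,
# producing rows ready for export to a visualization tool.
rows = con.execute("""
SELECT c.name, a.name, f.label
FROM company c
JOIN application a  ON a.company_id = c.id
JOIN functionality f ON f.application_id = a.id
""").fetchall()
print(rows)  # [('ExampleVendor', 'ExampleApp', 'Detection')]
```

The one-to-many links (company to applications, application to functionalities) are what allow a single query to produce the company/functionality/impact combinations used in the figures.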
Workflow codebook
| Workflow | Definition |
|---|---|
| Information | Provide information to patients and referring physician about a radiological examination |
| Indication | Decide the indications for certain examinations, for example, the implementation of institutional guidelines that describe the indications for specific radiologic examinations |
| Decision support | Support referring physician in choosing an examination for a specific patient |
| Vetting | Decide what imaging protocol is needed for a specific patient |
| Acquisition | Give input to technicians during image acquisition about adaptations to the imaging protocol in case of unexpected findings or other questions of the technicians. |
| Post-processing (modality) | Perform post-processing, before sending images to PACS |
| Prioritization | Decide the order in which images are read by the radiologist |
| Post-processing (PACS) | Perform post-processing steps during case reading, including anatomical segmentation |
| Detection | Detect and annotate abnormal findings |
| Segmentation (pathology) | Segment abnormal findings |
| Quantification (anatomical) | Quantify certain anatomical structures |
| Quantification (pathology) | Quantify abnormal findings |
| Interpretation | Interpret the detected normal or abnormal imaging findings in the context of the clinical history and the request of the referring physician; the cognitive process of going from imaging findings to a differential diagnosis. |
| Reporting | Report in terms of free-text or structured reporting |
| Communication | Communicate radiological findings by other means than the radiology report, for example, in a Multidisciplinary Team Meeting or by phone in case of the communication of critical findings. |
| Peer review | Analyse cases for peer review and provide feedback to other radiologists |
| Quality assurance | Perform tasks related to quality assurance, such as improving the workflow or assessing the quality of radiology reports |
Platform categories
| Category | Explanation |
|---|---|
| Small/“Umbrella” | Applications of one vendor accessible through one umbrella product. |
| Intermediate/“Storage box” | Platform for in-house development of AI algorithms |
| Large/“Market place” | Platform of AI tools for developers and customers, similar to app-stores for smartphones. |
Fig. 2 The founding year of the 27 companies and the distribution of the companies among the size categories, according to the number of employees
Fig. 3 Geographical distribution of acquired funding of 13 of the 27 companies, summed and averaged. The amounts in the national currencies have been converted to Euro. The average funding is calculated for all companies in each continent
Number of applications per company
| Number of applications per company | Number of companies | Total number of applications |
|---|---|---|
| 3 | 3 (11%) | 9 (24%) |
| 2 | 4 (15%) | 8 (22%) |
| 1 | 20 (74%) | 20 (54%) |
Application platforms
| Platform | Description | Cloud or on-premise | Own applications | Other applications | Neuroradiology exclusive | Nr neuro applicationsa | Nr applications (total)b |
|---|---|---|---|---|---|---|---|
| *Small/“Umbrella”* | | | | | | | |
| ACCIPIO ICH Platform | MaxQ AI’s diagnostic suite is being deployed directly onto both CT and PACS systems. | Both | Yes | No | Yes | 3 (8%) | 3 |
| e-Stroke Suite | e-Stroke Suite combines e-Aspects, e-CTA, and e-Mismatch. | Cloud-based | Yes | No | Yes | 2 (5%) | 3 |
| *Intermediate/“Storage box”* | | | | | | | |
| CuraCloud | AI Development Services supply medical imaging AI expertise and technical capabilities to healthcare organizations to create their own quality and productivity innovations using computer vision, machine learning, natural language processing, and other advanced informatics. | Cloud-based | Yes | Yes | No | 1 (3%) | 9 |
| Incepto | Incepto provides a collaborative environment to co-create, develop and distribute revolutionary applications for the diagnosis and treatment of diseases. | Cloud-based | No | Yes | No | 1 (3%) | 8 |
| *Large/“Marketplace”* | | | | | | | |
| Blackford | Blackford provides a single platform to access and manage a curated marketplace of regulatory-approved medical image analysis applications and AI algorithms that add clinical value. | Cloud-based | Yes | Yes | No | 4 (11%) | 13 |
| EnvoyAI | EnvoyAI provides a developer platform, integrations, and an API interface for algorithm developers, technology partners, and end users. | Both | Yes | Yes | No | 11 (30%) | 57 |
| Nuance AI Marketplace | Workflow-integrated marketplace for diagnostic imaging AI algorithms. | Cloud-based | No | Yes | No | 9 (24%) | 25 |
aIn brackets are the number of applications that indicate working with this particular platform. One application can be related to 1 or more platforms. For 19 (51%) applications, it is unknown whether they can work with a platform
bTotal number of applications/tools available at this platform for both neuroradiology and other subspecialties
PACS integration and location of computation

| | n (%) |
|---|---|
| *PACS integration* | |
| Seamless | 23 (62%) |
| Manual | 5 (14%) |
| Separate | 1 (3%) |
| Modality integrationa | 3 (4%) |
| Unknown integration | 7 (19%) |
| *Cloud or on-premise* | |
| Cloud-based computation | 17 (46%) |
| On-premise computationb | 5 (14%) |
| Location unknown | 17 (46%) |
aThree applications have modality integration, in addition to seamless integration
bOne application is categorized as both cloud-based and on-premise computation
Fig. 5 Approval of applications. Each circle represents several applications that are approved by a particular organization. The FDA is the Food and Drug Administration of the United States. CE means CE-marked. CE marking is a certification mark that indicates conformity with health, safety and environmental protection standards for applications sold within the European Economic Area (EEA). Numbers in the intersecting parts fall under two or more categories. The ‘other’ category represents the approval bodies of Australia, Canada, Korea, Singapore and Vietnam
Functionalities with explanation and examples
| Functionality | Count | Explanation and examples |
|---|---|---|
| Provides quantitative information about pathology | 13 (12%) | Measures the characteristics of pathologic findings. Example: |
| Marks regions of interest or detects change | 38 (34%) | Detects and highlights abnormal findings visually. Example: VIZ |
| Provides classification, diagnosis or outcome probabilities | 19 (17%) | Interprets imaging findings and provides a diagnosis or a standardized classification. Examples: |
| Prepares report | 15 (14%) | Organizes the diagnostic findings in a report. Example: |
| Automated derivation of brain biomarkers | 12 (11%) | Compares the quantitative information about anatomy or pathology with normal findings of a particular group. Example: |
| Workflow optimization and triaging | 12 (11%) | Improves the efficiency of the diagnostic process. Example: |
| Anatomical segmentation | 2 (2%) | Segments the images into normal anatomical areas. Example: |
Fig. 6 Workflow. The functionalities of each application are mapped to the items of the workflow of a radiologist, as described in the codebook in the methods section. Applications that detect and segment particular pathologic conditions are categorized under ‘detection’ and are not double-categorized under ‘segmentation (pathology)’. Applications that measure, for example, brain volume in the context of, for example, dementia, are categorized under ‘quantification (pathology)’ and not also under ‘quantification (anatomical)’
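The no-double-categorization rule in the Fig. 6 caption amounts to a simple precedence: when an application qualifies for both members of a pair, only the first member is counted. A minimal sketch of that rule; the function name and tag strings are illustrative assumptions, not the authors' code.

```python
# Precedence pairs from the caption: if an application qualifies for both
# members of a pair, it is counted only under the first.
PRECEDENCE = [
    ("detection", "segmentation (pathology)"),
    ("quantification (pathology)", "quantification (anatomical)"),
]

def workflow_categories(tags):
    """Return the workflow items an application is counted under,
    after applying the no-double-categorization rule."""
    kept = set(tags)
    for winner, loser in PRECEDENCE:
        if winner in kept and loser in kept:
            kept.discard(loser)
    return sorted(kept)

print(workflow_categories({"detection", "segmentation (pathology)"}))
# ['detection']
print(workflow_categories({"quantification (anatomical)"}))
# ['quantification (anatomical)']
```

Note that the second member of a pair is dropped only when the first is also present; an application that only segments pathology would still be counted under ‘segmentation (pathology)’.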
Fig. 4 Sankey flow diagram. From left to right, the columns of items represent the companies, the functionalities and the impact. The size of each item corresponds to the relative value within the category. For example, the most frequent affordance is ‘quantification (pathology)’, and the most frequent impact is ‘supporting’
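A Sankey diagram of this kind is driven by weighted flows between adjacent columns. The sketch below shows how such flows could be tallied from per-application records; the record fields and values are illustrative assumptions, not data from the study.

```python
from collections import Counter

# One record per application: which company made it, which functionality
# it affords, and which impact category it was coded under.
records = [
    {"company": "A", "functionality": "quantification (pathology)", "impact": "supporting"},
    {"company": "A", "functionality": "detection", "impact": "supporting"},
    {"company": "B", "functionality": "quantification (pathology)", "impact": "extending"},
]

# Each flow is a (source, target) pair between adjacent columns; its count
# determines the width of the corresponding band in the diagram.
flows = Counter()
for r in records:
    flows[(r["company"], r["functionality"])] += 1
    flows[(r["functionality"], r["impact"])] += 1

for (src, dst), n in sorted(flows.items()):
    print(f"{src} -> {dst}: {n}")
```

The resulting `(source, target) -> count` mapping is exactly the input format most Sankey plotting tools expect, so only a rendering step would remain.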