
Implementation of artificial intelligence (AI) applications in radiology: hindering and facilitating factors.

Lea Strohm1, Charisma Hehakaya2, Erik R Ranschaert3, Wouter P C Boon1, Ellen H M Moors1.   

Abstract

OBJECTIVE: The objective was to identify barriers and facilitators to the implementation of artificial intelligence (AI) applications in clinical radiology in The Netherlands.
MATERIALS AND METHODS: Using an embedded multiple case study, an exploratory, qualitative research design was followed. Data collection consisted of 24 semi-structured interviews from seven Dutch hospitals. The analysis of barriers and facilitators was guided by the recently published Non-adoption, Abandonment, Scale-up, Spread, and Sustainability (NASSS) framework for new medical technologies in healthcare organizations.
RESULTS: Among the most important facilitating factors for implementation were the following: (i) pressure for cost containment in the Dutch healthcare system, (ii) high expectations of AI's potential added value, (iii) presence of hospital-wide innovation strategies, and (iv) presence of a "local champion." Among the most prominent hindering factors were the following: (i) inconsistent technical performance of AI applications, (ii) unstructured implementation processes, (iii) uncertain added value for clinical practice of AI applications, and (iv) large variance in acceptance and trust of direct (the radiologists) and indirect (the referring clinicians) adopters.
CONCLUSION: In order for AI applications to contribute to the improvement of the quality and efficiency of clinical radiology, implementation processes need to be carried out in a structured manner, thereby providing evidence on the clinical added value of AI applications. KEY POINTS: • Successful implementation of AI in radiology requires collaboration between radiologists and referring clinicians. • Implementation of AI in radiology is facilitated by the presence of a local champion. • Evidence on the clinical added value of AI in radiology is needed for successful implementation.

Keywords:  Artificial intelligence; Computer systems; Computer-assisted; Diagnosis; Information systems; Radiology

Year:  2020        PMID: 32458173      PMCID: PMC7476917          DOI: 10.1007/s00330-020-06946-y

Source DB:  PubMed          Journal:  Eur Radiol        ISSN: 0938-7994            Impact factor:   5.315


Introduction

Artificial intelligence (AI) is increasingly being recognized as an important technology in clinical radiology [1-5]. Recent advances in machine learning have produced algorithms that allow automated and accurate detection and diagnosis of abnormalities in medical images. These large technological improvements have created high expectations among radiologists, healthcare providers, and policymakers alike, as they promise considerable efficiency and quality gains for healthcare, for example by allowing more precise diagnoses and by automating labor-intensive tasks currently performed by radiologists [3, 6]. AI is expected to cause large changes in clinical work practices and to require complementary skills from radiologists [5, 7, 8]. A narrative of "radiologists being replaced by AI" has emerged, as discussions of this topic have flooded major conferences and scientific publications [8, 9]. Unsurprisingly, the replacement narrative has triggered strong responses within the radiology profession [1, 3, 10, 11].
While the technical performance of AI applications is expected to improve continuously, their implementation in clinical radiology practice is complex and has so far been slow [6, 12, 13]. Earlier forms of AI, such as the first computer-aided diagnosis (CAD) systems, failed to achieve widespread adoption. The literature has mainly attributed this to the low technical performance of these early applications [5, 14-16], while other potential barriers to successful implementation, such as organizational and social aspects, have been largely ignored [17]. Technology implementation in hospital settings involves a large variety of stakeholders and organizational procedures, with strong routines and professional identities, as well as strict legal and regulatory standards [18-20]. Considering that AI applications in radiology are in an emerging phase, it is too early to evaluate their implementation [21].
However, given the failure of earlier CAD systems to diffuse widely, it can be assumed that AI applications will encounter barriers to implementation. We studied the facilitating and hindering factors for the successful implementation of AI applications in radiology departments, including not only technological but also organizational and social aspects.

Materials and methods

In this study, we use the Non-adoption, Abandonment, Scale-up, Spread, and Sustainability (NASSS) framework [20] to identify the success and failure factors in the implementation of AI applications in clinical radiology, thereby also focusing on socio-organizational aspects. The NASSS framework aims to identify the determinants of implementation processes of complex technologies in healthcare across seven domains: the condition, the technology, the value proposition, the adopter system (patients, lay caregivers, individual technology users, and other staff), the organization, the wider institutional and societal context, and the embedding and adaptation of the technology over time. The framework thereby takes a dynamic perspective, following the interactions between these domains over time. Based on this framework, we formulated propositions on potential facilitating and hindering factors. We used an embedded multiple case study approach, investigating seven Dutch hospitals. Appendix A provides an overview of our interviewees and the cases, i.e., hospitals, in which they are involved. We chose to focus on The Netherlands given the high pressure for cost savings in its healthcare sector, to which AI in radiology is expected to contribute significantly [22-24]. Dutch radiology departments vary strongly in the number and types of internally available AI applications, ranging from detection and quantification of lung nodules in CT scans and mammography CAD systems to stroke detection and automated bone-age assessment. We focused on those departments that used BoneXpert, a software-only medical device commercially distributed by Visiana since 2009, which performs automated bone-maturity assessment from X-rays of pediatric patients' hands.
BoneXpert is one of the first commercial applications of AI in radiology [25] and appeared to be the only AI application in clinical use across several hospitals in The Netherlands. Using a maximum-variability logic, we selected interviewees occupying different positions within the participating hospitals (see Table 1 and Appendix A). Interviewees were contacted based on their experience with BoneXpert specifically, or with the implementation of AI applications in radiology more generally, as indicated by publicly available information or internal referral. The number of interviewees varied from one to four per hospital. Furthermore, four key-informant interviews were used to investigate the external context of the cases, such as socio-economic and regulatory influences.
Table 1

Overview cases and interviewees

Case / informant            | Interviews | Roles of interviewees
Cases (7 hospitals)
TKZ1                        | 4          | Senior radiologist; legal consultant; clinical physicist; operational department manager
TKZ2                        | 4          | Senior radiologist (2); junior technical physician; innovation manager
UMC1                        | 4          | Senior radiologist (3); innovation manager
UMC2                        | 3          | Junior radiologist (2); senior data scientist
UMC3                        | 3          | Senior radiologist (2); senior data scientist
UMC4                        | 1          | Senior radiologist
AZ1                         | 1          | Senior radiologist
Key informants
Professional organization   | 1          | Member of management
Professional organization   | 1          | Implementation advisor
Professional organization   | 1          | Member of management
Imaging technology provider | 1          | Innovation manager
Total number of interviews  | 24         |
From February to June 2019, 24 semi-structured interviews were conducted by a single researcher, complemented by a document analysis of internal documents from the respective cases and publicly available documents. Interviews were conducted until thematic saturation was reached, i.e., until no new themes emerged in additional interviews. Twenty-one interviews were conducted face-to-face; three were held by telephone. Interviews were held in English where possible and lasted between 20 and 80 min. Oral permission for recording was granted by all interviewees. The interviews were subsequently transcribed and coded in NVivo. The concepts identified in the interviews were compared with the original NASSS framework, which was then refined accordingly.

Results

We first present the facilitating factors, followed by the hindering factors for AI implementation in radiology.

Identified facilitating factors for AI implementation in radiology

Table 2 provides an overview of the number and identity of interviewees referencing the facilitating factors.
Table 2

Overview of facilitating factors for AI implementation

Facilitating factor                                  | Interviewees (by interviewee ID, following Appendix A)                  | Sum
Pressure on healthcare budgets                       | 4, 18, 19, 20, 22                                                       | 5
Expected added benefit: improved diagnostic practice | 1, 2, 3, 4, 5, 6, 7, 8, 10, 11, 12, 13, 14, 16, 17, 20, 22, 23         | 18
Expected added benefit: operational benefits         | 1, 2, 3, 4, 5, 6, 7, 8, 10, 11, 12, 13, 14, 16, 20, 22, 23             | 17
Easy integration in PACS                             | 2, 3, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 16, 20, 22                    | 15
Minimize workflow changes                            | 1, 2, 4, 5, 6, 7, 8, 10, 11, 12, 14, 17, 20, 22                        | 14
BoneXpert smooth integration in PACS                 | 1, 3, 5, 7, 8, 9, 10, 13, 14, 22                                       | 10
Innovation strategy                                  | 4/7 hospitals                                                           |
Innovation manager                                   | 3/7 hospitals                                                           |
Local champions                                      | 1, 2, 3, 8, 10, 12, 14, 17, 22, 23                                     | 10
First, pressure on healthcare budgets stimulates Dutch hospitals and radiology departments to develop and implement innovative technologies that promise efficiency and/or quality gains. The Dutch healthcare system is confronted with a constant rise in demand accompanied by strong pressure to limit the associated costs [27]. This creates a favorable political climate for AI applications. Second, although there is little empirical evidence, radiologists, members of hospital management, and technology developers expect a large added value of AI applications in clinical practice. The interviewees mentioned two main benefits: (1) improved diagnostic practice due to more precise and objective diagnoses, avoidance of mistakes, and the automation of cumbersome tasks; and (2) operational benefits, such as diminished workloads, time savings, more consistent reporting across radiologists, and extended service availability. Third, in order to be perceived as user-friendly by radiologists, AI applications need to be easily integrated into the IT systems radiologists already use, such as picture archiving and communication systems (PACSs). This means that the output of the AI application should be displayed with as few clicks as possible. Also, AI applications should be implemented without large changes to routines and workflow practices, i.e., by avoiding additional steps for reporting the result of the AI application. For example, the interviewed users experienced the integration of BoneXpert into the PACS as very smooth, which was the main reason for its perceived user-friendliness. However, concerns remain about the integration of other AI applications into PACSs. Fourth, openness towards AI applications in radiology is expressed by the adoption of hospital-wide or radiology department-specific innovation strategies.
In four of the seven cases, a hospital-wide innovation strategy including AI was present, reflecting innovation leadership among the hospital management. Such leadership is also manifested in the presence or absence of a designated innovation manager (present in three of the seven cases). At radiology-department level, only one hospital had a formalized innovation strategy regarding AI; however, four more hospitals were developing such a strategic approach at the time of this research. Fifth, the interviews and document analysis [28, 29] show that local champions are vital in initiating and stimulating implementation within their department and in taking the lead during the entire process. These local champions are radiologists who show a particularly strong interest in AI applications and usually have a better-than-average understanding of their technical aspects. To overcome the opposition of potentially skeptical colleagues, local champions appear to follow two strategies: (1) providing general information on AI, or on a particular AI application, through scientific articles, books, and presentations; and (2) promoting opportunities for experimentation with an application, e.g., by organizing showcases or installing a test version. Both strategies aim to build trust and serve to familiarize other radiologists (direct adopters) and referring clinicians (indirect adopters) with the AI application and to convince them of its value. Finally, the Radiological Society of The Netherlands (NVvR) serves as a knowledge-exchange platform among its members, facilitating the implementation of AI applications. The NVvR has included AI in its strategic research agenda since 2017 [28] and has a "technology committee," i.e., a study group that raises awareness among Dutch radiologists, e.g., by organizing regular open meetings, advising hospitals on the development of an AI strategy, and pursuing the inclusion of AI in the curriculum for future radiologists.

Identified hindering factors for AI implementation in radiology

Table 3 provides an overview of the number and identity of interviewees referencing the hindering factors. First, users perceived the technical performance of most AI applications as inconsistent. Technically, this refers to the algorithms' performance, i.e., their sensitivity (the proportion of actual findings detected, which is reduced by false negatives) and specificity (the proportion of normal cases correctly identified, which is reduced by false positives). In clinical terms, a large number of false positives creates additional work for the radiologist, which was the case for earlier unsuccessful CAD applications. A large number of false negatives is even more dangerous, because it means that a potential lesion might be overlooked. Against a background of limited technical understanding of AI, some radiologists doubt the quality and safety of an application and either fail to adopt AI applications or abandon them. The interviews showed that the computer science and programming knowledge required to develop AI algorithms is not a present-day competency of radiologists. However, some technical understanding is imperative for assessing quality and safety, and therefore for creating trust in an AI application's reliability [8, 21, 30].
Table 3

Overview of hindering factors for AI implementation

Hindering factor                                      | Interviewees (by interviewee ID, following Appendix A)              | Sum
Inconsistent technical performance                    | 3, 5, 6, 10, 11, 12, 13, 14, 22, 23                                 | 10
Doubting quality and safety of the application        | 2, 3, 4, 5, 6, 10, 11, 12, 13, 14                                   | 10
Technical knowledge necessary                         | 1, 2, 3, 5, 8, 11, 12, 13, 14, 20                                   | 10
Unstructured planning and monitoring                  | 2, 3, 5, 8, 9, 12, 14, 16, 17, 22                                   | 10
Unstructured implementation in workflow               | 3, 4, 5, 7, 9, 11, 12, 13, 16                                       | 9
Absence of guidelines/best practices                  | 3, 4, 5, 9, 12, 15, 16, 19                                          | 8
No empirical evidence on AI applications (validation) | 3, 4, 5, 8, 9, 12, 13, 20, 23                                       | 9
Uncertain funding                                     | 1, 2, 3, 4, 6, 7, 8, 11, 12, 13, 14, 16, 18, 22                     | 14
Limited communication between departments             | 6, 7, 9, 10, 17, 19, 20, 22                                         | 8
Inconsistent acceptance/trust of radiologists         | 1, 2, 3, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 22                      | 14
Acceptance/trust of referring clinicians              | 1, 2, 4, 7, 10, 22                                                  | 6
Inconsistent acceptance of BoneXpert                  | 1, 3, 5, 7, 8, 9, 10, 13, 14, 22                                    | 10
Reframe professional identity/responsibilities        | 2, 3, 4, 5, 6, 7, 13, 22                                            | 8
Framing/narrative as co-pilot                         | 2, 3, 12, 14, 16                                                    | 5
Regulatory and legal uncertainties                    | 3, 4, 8, 11, 13, 15, 17, 23                                         | 8
Reference to post-market surveillance MDR             | 3, 4, 19, 20, 22                                                    | 5
Legal responsibility for mistakes                     | 4, 8, 11, 15, 17                                                    | 5
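The relationship between the error types discussed under the first hindering factor and the two standard performance measures can be made concrete with a short sketch; the counts below are hypothetical and purely illustrative, not drawn from any of the studied applications:

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    """Compute sensitivity and specificity from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)  # true positive rate: driven down by false negatives (missed lesions)
    specificity = tn / (tn + fp)  # true negative rate: driven down by false positives (extra reading work)
    return sensitivity, specificity

# Hypothetical counts: 100 scans with a true finding, 100 without.
sens, spec = sensitivity_specificity(tp=90, fn=10, tn=80, fp=20)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")  # sensitivity=0.90, specificity=0.80
```

The asymmetry noted in the text follows directly: the 20 false positives here each cost the radiologist extra verification work, while the 10 false negatives represent potentially overlooked lesions.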
Second, planning and monitoring of AI implementation tend to be unstructured. From an organizational perspective, the clinical benefits or organizational goals to be achieved by using AI applications are not clearly established ex ante and are therefore hard to assess after implementation. From a workflow perspective, implementation plans do not specify how the AI application should be integrated into the clinical workflow, which leads to significant variation in the way the application is used across departments. Furthermore, in all cases, hospital-level monitoring of existing practices and of the impact of newly implemented technologies is currently limited. The unstructured nature of implementation processes can be explained by the absence of official guidelines or best practices. Third, there is a lack of empirical evidence on the effect of AI applications on the radiological workflow, as well as on their added value for clinical radiology practice. One reason is that measuring the clinical and organizational benefits of AI at a micro-level is difficult; there is, for example, no standard methodology for measuring increases in the quality of diagnosis. Where evidence on technical performance is available (as with BoneXpert), interviewees noted that the publications validating the algorithms are based on laboratory rather than clinical settings. Fourth, funding for AI applications is uncertain because the evidence on their added value needed to back adoption and funding decisions is lacking. Moreover, the benefits and costs of using AI may be unequally divided over departments, which complicates funding decisions. If other departments are to cover part of the additional costs, they need to learn about the technology and the potential benefits of using AI applications.
This requires efficient communication between departments, a shortcoming in several of the studied cases. Fifth, the acceptance of, and trust in, AI applications differs greatly between direct adopters (radiologists) and indirect adopters (referring clinicians). Radiologists' perspectives on AI applications range from outright enthusiasm to curiosity, skepticism, and fear [31]. These differences in opinion across radiologists were also visible for BoneXpert. Interestingly, none of the interviewees expressed fear of being replaced by AI. Rather, interviewees mentioned the need to reframe their professional identity and responsibilities as a consequence of the arrival of AI applications. For example, they envision radiologists of the future as "imaging consultants" who play an active role in an interdisciplinary, patient-focused hospital environment [28]. An important element in this reframing process is creating the "right" narrative around AI. To overcome resistance by radiologists, developers and hospital management frame AI applications as "co-pilots" that enable radiologists to perform better while staying in control. Achieving acceptance among referring clinicians is important, since they are the potential final "customers" of the AI application's output. Interestingly, in three hospitals, we found that the referring clinicians did not trust the output of the AI application and redid the bone-age analysis manually for every scan. Thus, just like the radiologists, the referring clinicians showed varying levels of acceptance of AI applications. Finally, the lack of case law under the European General Data Protection Regulation and the new Medical Device Regulation (MDR), due to come into effect in May 2020, leads to several regulatory and legal uncertainties for AI applications in radiology [32, 33]. Currently, CE marks are granted without requiring proof of performance and added benefit for clinical practice.
The new MDR requires CE certification through a notified body and substantially raises the requirements on quality, safety, and post-market surveillance. Additionally, our interviewees expressed concerns about the unresolved question of legal responsibility for damage caused by, e.g., false negatives and false positives in an AI-generated diagnosis [34]. Based on the empirical findings, the NASSS framework was adapted and refined for the case of the implementation of AI applications in radiology. The main adaptation of the original NASSS model concerns the "adopter system" (Fig. 1), which is elaborated with local champions, radiologists as direct adopters of the technology, and referring clinicians as indirect adopters.
Fig. 1

The NASSS framework [20], specified for AI applications in clinical radiology in The Netherlands


Discussion

This research contributes to the existing empirical evidence on the implementation challenges of AI-based medical technologies. We identified a lack of acceptance as one of the most important causes of non-adoption and abandonment, and thus a barrier to the successful implementation of AI applications in radiology. The determinants of radiologists' acceptance of AI applications found in this study are in line with evidence from surveys among radiologists [2, 35] and radiology residents [31], and with earlier evidence on the determinants of clinicians' acceptance of computerized decision support systems (CDSS): insufficient knowledge [5, 6, 36], trust [36, 37], change in clinicians' professional identity, and professional autonomy [36, 38]. We found local champions to play a crucial role in overcoming a lack of acceptance among technology users. The significance of having a local champion had previously appeared in research on the adoption of telehealth systems [39], as well as on the implementation of CDSS [36]. Notably, a recent study on the implementation of CDSS in US radiology departments also identified local champions as an important facilitator of implementation [40]. Both studies mention the local champions' facilitating role in starting and advancing the implementation processes of CDSS. Another implementation challenge we found, the role of evidence in innovation implementation, has been discussed extensively in the field of evidence-based healthcare [41]. Scientific evidence is an important determinant of innovation implementation for practitioners, a finding that also appears to hold for AI in radiology [8, 21, 41, 42]. AI applications for radiology thus reflect a trend in the field of medical imaging to engage with technologies that have yet to fulfill their promise of improving the quality or efficiency of healthcare [43].
AI applications in radiology are predicted not only to support but also to potentially automate certain medical decision processes, thereby calling into question the jobs of highly educated professionals. This element of job displacement due to automation adds to the complexity of adoption and implementation processes in the field of health digitization. While the possibility of AI replacing radiologists, and thereby threatening their professional identity, has been extensively discussed in recent radiology publications [5, 9, 16, 44], none of the interviewees in this research identified with this threat. This aligns with recent opinion surveys conducted among radiologists [2, 31, 35]. Across the healthcare field, radiologists already have the most digitized work environment [44], and they self-identify as logical frontrunners in using digitized supporting tools in their daily practice. In order to take on a leading role in the implementation of AI applications within the hospital, radiologists need to acquire AI literacy through complementary training [5-7]. Due to its exploratory nature and qualitative empirical approach, several limitations of this research need to be taken into consideration. The research focuses only on AI applications in Dutch radiology departments, so its findings cannot be directly generalized to other healthcare systems. Across the cases, individuals with different roles and positions were interviewed, which limits the comparability of the cases and the generalizability of the results to other hospitals in The Netherlands and beyond. Interviewees varied with regard to their relevant experience and knowledge, owing to the early stage of implementation of AI in radiology practice. The sample of interviewees is therefore possibly biased towards individuals with a particular interest in, and an above-average positive attitude towards, AI applications.
In order to achieve better generalizability of the results, further research should investigate applications that present higher technical complexity than BoneXpert and represent a larger part of the diagnostic work done by radiologists. Furthermore, it is important to understand how country-specific political and social contexts determine the implementation processes. Future studies can identify specific technical challenges for the implementation of AI applications, e.g., datasets and associated requirements (their size, algorithms, and data heterogeneity). Additionally, future research should focus on the impact of the implementation of AI applications on the knowledge development of radiologists.

Conclusion

Considering the great attention AI applications are receiving in radiology and in other medical disciplines such as pathology, understanding the barriers to and facilitators of the implementation of AI is important. One important facilitating factor is the presence of a "local champion," an individual with a strong personal interest in AI applications who typically initiates and actively advances AI implementation in the organization. Among the most prominent hindering factors is the uncertain added value for clinical practice, which causes low acceptance of AI applications among adopters and complicates the mobilization of funds to acquire them. Furthermore, the failure to include all relevant stakeholders in the planning, execution, and monitoring phases of implementation was found to be a major hindering factor. To increase acceptance among adopters, more evidence of the added benefit of AI applications in the clinical setting is needed. Also, all involved stakeholders (most notably radiologists and referring clinicians) should be included in the decisions about, and the design of, implementation processes for AI applications.
References (35 in total; first 10 shown)

Review 1.  Computer-aided diagnosis: how to move from the laboratory to the clinic.

Authors:  Bram van Ginneken; Cornelia M Schaefer-Prokop; Mathias Prokop
Journal:  Radiology       Date:  2011-12       Impact factor: 11.105

2.  Impact of the rise of artificial intelligence in radiology: What do radiologists think?

Authors:  Q Waymel; S Badr; X Demondion; A Cotten; T Jacques
Journal:  Diagn Interv Imaging       Date:  2019-05-06       Impact factor: 4.026

3.  Adapting to Artificial Intelligence: Radiologists and Pathologists as Information Specialists.

Authors:  Saurabh Jha; Eric J Topol
Journal:  JAMA       Date:  2016-12-13       Impact factor: 56.272

4.  Medical students' attitude towards artificial intelligence: a multicentre survey.

Authors:  D Pinto Dos Santos; D Giese; S Brodehl; S H Chon; W Staab; R Kleinert; D Maintz; B Baeßler
Journal:  Eur Radiol       Date:  2018-07-06       Impact factor: 5.315

Review 5.  The future of radiology augmented with Artificial Intelligence: A strategy for success.

Authors:  Charlene Liew
Journal:  Eur J Radiol       Date:  2018-03-14       Impact factor: 3.528

6.  Artificial intelligence in radiology: the ecosystem essential to improving patient care.

Authors:  Julie Sogani; Bibb Allen; Keith Dreyer; Geraldine McGinty
Journal:  Clin Imaging       Date:  2019-08-31       Impact factor: 1.605

Review 7.  The practical implementation of artificial intelligence technologies in medicine.

Authors:  Jianxing He; Sally L Baxter; Jie Xu; Jiming Xu; Xingtao Zhou; Kang Zhang
Journal:  Nat Med       Date:  2019-01-07       Impact factor: 53.440

8.  What hinders the uptake of computerized decision support systems in hospitals? A qualitative study and framework for implementation.

Authors:  Elisa G Liberati; Francesca Ruggiero; Laura Galuppo; Mara Gorli; Marien González-Lorenzo; Marco Maraldi; Pietro Ruggieri; Hernan Polo Friz; Giuseppe Scaratti; Koren H Kwag; Roberto Vespignani; Lorenzo Moja
Journal:  Implement Sci       Date:  2017-09-15       Impact factor: 7.327

9.  A Human(e) Factor in Clinical Decision Support Systems.

Authors:  Tim Bezemer; Mark Ch de Groot; Enja Blasse; Maarten J Ten Berg; Teus H Kappen; Annelien L Bredenoord; Wouter W van Solinge; Imo E Hoefer; Saskia Haitjema
Journal:  J Med Internet Res       Date:  2019-03-19       Impact factor: 5.428

10.  A qualitative framework-based evaluation of radiology clinical decision support initiatives: eliciting key factors to physician adoption in implementation.

Authors:  Laura Haak Marcial; Douglas S Johnston; Michael R Shapiro; Sara R Jacobs; Barry Blumenfeld; Lucia Rojas Smith
Journal:  JAMIA Open       Date:  2019-02-22