Remy Kusters, Dusan Misevic, Hugues Berry, Antoine Cully, Yann Le Cunff, Loic Dandoy, Natalia Díaz-Rodríguez, Marion Ficher, Jonathan Grizou, Alice Othmani, Themis Palpanas, Matthieu Komorowski, Patrick Loiseau, Clément Moulin Frier, Santino Nanini, Daniele Quercia, Michele Sebag, Françoise Soulié Fogelman, Sofiane Taleb, Liubov Tupikina, Vaibhav Sahu, Jill-Jênn Vie, Fatima Wehbi.
Abstract
The use of artificial intelligence (AI) in a variety of research fields is speeding up multiple digital revolutions, from shifting paradigms in healthcare, precision medicine and wearable sensing, to public services and education offered to the masses around the world, to future cities made optimally efficient by autonomous driving. When a revolution happens, the consequences are not obvious straight away, and to date, there is no uniformly adopted framework to guide AI research toward a sustainable societal transition. To address this need, here we analyze three key challenges to interdisciplinary AI research and deliver three broad conclusions: 1) future development of AI should not only impact other scientific domains but should also take inspiration and benefit from other fields of science; 2) AI research must be accompanied by decision explainability, dataset bias transparency, and the development of evaluation methodologies and creation of regulatory agencies to ensure responsibility; and 3) AI education should receive more attention, effort and innovation from the educational and scientific communities. Our analysis is of interest not only to AI practitioners but also to other researchers and the general public, as it offers ways to guide emerging collaborations and interactions toward the most fruitful outcomes.
Keywords: artificial intelligence; auditability; education; ethics; interdisciplinary science; interpretability
Year: 2020 PMID: 33693418 PMCID: PMC7931862 DOI: 10.3389/fdata.2020.577974
Source DB: PubMed Journal: Front Big Data ISSN: 2624-909X