
Effects of information source, pedigree, and reliability on operator interaction with decision support systems.

Poornima Madhavan, Douglas A Wiegmann.

Abstract

OBJECTIVE: Two experiments are described that examined operators' perceptions of decision aids.
BACKGROUND: Research has suggested certain biases against automation that influence human interaction with automation. We differentiated preconceived biases from post hoc biases and examined their effects on advice acceptance.
METHOD: In Study 1 we examined operators' trust in and perceived reliability of humans versus automation of varying pedigree (expert vs. novice), based on written descriptions of these advisers prior to operators' interacting with these advisers. In Study 2 we examined participants' post hoc trust in, perceived reliability of, and dependence on these advisers after their objective experience of advisers' reliability (90% vs. 70%) in a luggage-screening task.
RESULTS: In Study 1 measures of perceived reliability indicated that automation was perceived as more reliable than humans across pedigrees. Measures of trust indicated that automated "novices" were trusted more than human "novices"; human "experts" were trusted more than automated "experts." In Study 2, perceived reliability varied as a function of pedigree, whereas subjective trust was always higher for automation than for humans. Advice acceptance from novice automation was always higher than from novice humans. However, when advisers were 70% reliable, errors generated by expert automation led to a drop in compliance/reliance on expert automation relative to expert humans.
CONCLUSION: Preconceived expectations of automation influence the use of these aids in actual tasks.
APPLICATION: The results provide a reference point for deriving indices of "optimal" user interaction with decision aids and for developing frameworks of trust in decision support systems.


Year:  2007        PMID: 17915596     DOI: 10.1518/001872007X230154

Source DB:  PubMed          Journal:  Hum Factors        ISSN: 0018-7208            Impact factor:   2.888


Cited by:  8 in total

Review 1.  Automation bias: a systematic review of frequency, effect mediators, and mitigators.

Authors:  Kate Goddard; Abdul Roudsari; Jeremy C Wyatt
Journal:  J Am Med Inform Assoc       Date:  2011-06-16       Impact factor: 4.497

2.  Understanding the effect of workload on automation use for younger and older adults.

Authors:  Sara E McBride; Wendy A Rogers; Arthur D Fisk
Journal:  Hum Factors       Date:  2011-12       Impact factor: 2.888

3.  Advice Taking from Humans and Machines: An fMRI and Effective Connectivity Study.

Authors:  Kimberly Goodyear; Raja Parasuraman; Sergey Chernyak; Poornima Madhavan; Gopikrishna Deshpande; Frank Krueger
Journal:  Front Hum Neurosci       Date:  2016-11-04       Impact factor: 3.169

4.  Feedback and Direction Sources Influence Navigation Decision Making on Experienced Routes.

Authors:  Yu Li; Weijia Li; Yingying Yang; Qi Wang
Journal:  Front Psychol       Date:  2019-09-13

Review 5.  The Next Generation of Medical Decision Support: A Roadmap Toward Transparent Expert Companions.

Authors:  Sebastian Bruckert; Bettina Finzel; Ute Schmid
Journal:  Front Artif Intell       Date:  2020-09-24

6.  Challenging presumed technological superiority when working with (artificial) colleagues.

Authors:  Tobias Rieger; Eileen Roesler; Dietrich Manzey
Journal:  Sci Rep       Date:  2022-03-08       Impact factor: 4.379

7.  Learning From the Slips of Others: Neural Correlates of Trust in Automated Agents.

Authors:  Ewart J de Visser; Paul J Beatty; Justin R Estepp; Spencer Kohn; Abdulaziz Abubshait; John R Fedota; Craig G McDonald
Journal:  Front Hum Neurosci       Date:  2018-08-10       Impact factor: 3.169

8.  Automated Systems and Trust: Mineworkers' Trust in Proximity Detection Systems for Mobile Machines.

Authors:  LaTasha R Swanson; Jennica L Bellanca; Justin Helton
Journal:  Saf Health Work       Date:  2019-09-25
