
Trust in smart systems: sharing driving goals and giving information to increase trustworthiness and acceptability of smart systems in cars.

Frank M F Verberne, Jaap Ham, Cees J H Midden.

Abstract

OBJECTIVE: We examine whether trust in smart systems is generated analogously to trust in humans and whether the automation level of smart systems affects trustworthiness and acceptability of those systems.
BACKGROUND: Trust is an important factor when considering acceptability of automation technology. As shared goals lead to social trust, and intelligent machines tend to be treated like humans, the authors expected that shared driving goals would also lead to increased trustworthiness and acceptability of adaptive cruise control (ACC) systems.
METHOD: In an experiment, participants (N = 57) were presented with descriptions of three ACCs with different automation levels that were described as systems that either shared their driving goals or did not. Trustworthiness and acceptability of all the ACCs were measured.
RESULTS: ACCs sharing the driving goals of the user were more trustworthy and acceptable than were ACCs not sharing the driving goals of the user. Furthermore, ACCs that took over driving tasks while providing information were more trustworthy and acceptable than were ACCs that took over driving tasks without providing information. Trustworthiness mediated the effects of both driving goals and automation level on acceptability of ACCs.
CONCLUSION: As when trusting other humans, trusting smart systems depends on those systems sharing the user's goals. Furthermore, based on their description, smart systems that take over tasks are judged more trustworthy and acceptable when they also provide information.
APPLICATION: For optimal acceptability of smart systems, the goals of the user should be shared by the smart systems, and smart systems should provide information to their user.

Year:  2012        PMID: 23156624     DOI: 10.1177/0018720812443825

Source DB:  PubMed          Journal:  Hum Factors        ISSN: 0018-7208            Impact factor:   2.888


Related articles: 6 in total

1.  Research on the use intention of potential designers of unmanned cars based on technology acceptance model.

Authors:  Tianyang Huang
Journal:  PLoS One       Date:  2021-08-20       Impact factor: 3.240

2.  Pedestrian Trust in Automated Vehicles: Role of Traffic Signal and AV Driving Behavior.

Authors:  Suresh Kumaar Jayaraman; Chandler Creech; Dawn M Tilbury; X Jessie Yang; Anuj K Pradhan; Katherine M Tsui; Lionel P Robert
Journal:  Front Robot AI       Date:  2019-11-28

3.  Professional decision making with digitalisation of patient contacts in a medical advice setting: a qualitative study of a pilot project with a chat programme in Sweden.

Authors:  Åsa Cajander; Gustaf Hedström; Sofia Leijon; Marta Larusdottir
Journal:  BMJ Open       Date:  2021-12-02       Impact factor: 2.692

4.  Multi-device trust transfer: Can trust be transferred among multiple devices?

Authors:  Kohei Okuoka; Kouichi Enami; Mitsuhiko Kimoto; Michita Imai
Journal:  Front Psychol       Date:  2022-08-03

5.  [Review] From Trust in Automation to Decision Neuroscience: Applying Cognitive Neuroscience Methods to Understand and Improve Interaction Decisions Involved in Human Automation Interaction.

Authors:  Kim Drnec; Amar R Marathe; Jamie R Lukos; Jason S Metcalfe
Journal:  Front Hum Neurosci       Date:  2016-06-30       Impact factor: 3.169

6.  The effects of personality and locus of control on trust in humans versus artificial intelligence.

Authors:  Navya Nishith Sharan; Daniela Maria Romano
Journal:  Heliyon       Date:  2020-08-28
