
Humans: still vital after all these years of automation.

Raja Parasuraman, Christopher D Wickens.

Abstract

OBJECTIVE: The authors discuss empirical studies of human-automation interaction and their implications for automation design.
BACKGROUND: Automation is prevalent in safety-critical systems and increasingly in everyday life. Many studies of human performance in automated systems have been conducted over the past 30 years.
METHODS: Developments in three areas are examined: levels and stages of automation, reliance on and compliance with automation, and adaptive automation.
RESULTS: Automation applied to information analysis or decision-making functions leads to differential system performance benefits and costs that must be considered in choosing appropriate levels and stages of automation. Human user dependence on automated alerts and advisories reflects two components of operator trust, reliance and compliance, which are in turn determined by the threshold designers use to balance automation misses and false alarms. Finally, adaptive automation can provide additional benefits in balancing workload and maintaining the user's situation awareness, although more research is required to identify when adaptation should be user controlled or system driven.
CONCLUSIONS: The past three decades of empirical research on humans and automation have provided a strong science base that can be used to guide the design of automated systems.
APPLICATION: This research can be applied to most current and future automated systems.


Year:  2008        PMID: 18689061     DOI: 10.1518/001872008X312198

Source DB:  PubMed          Journal:  Hum Factors        ISSN: 0018-7208            Impact factor:   2.888


Related articles: 14 in total

1.  Toward a framework for levels of robot autonomy in human-robot interaction.

Authors:  Jenay M Beer; Arthur D Fisk; Wendy A Rogers
Journal:  J Hum Robot Interact       Date:  2014-07

2.  Prospective memory in an air traffic control simulation: external aids that signal when to act.

Authors:  Shayne Loft; Rebekah E Smith; Adella Bhaskara
Journal:  J Exp Psychol Appl       Date:  2011-03

3.  Understanding human management of automation errors.

Authors:  Sara E McBride; Wendy A Rogers; Arthur D Fisk
Journal:  Theor Issues Ergon Sci       Date:  2014

4.  How different types of users develop trust in technology: a qualitative analysis of the antecedents of active and passive user trust in a shared technology.

Authors:  Jie Xu; Kim Le; Annika Deitermann; Enid Montague
Journal:  Appl Ergon       Date:  2014-05-29       Impact factor: 3.661

5.  Alterations in cognitive performance during passive hyperthermia are task dependent.

Authors:  Nadia Gaoua; Sebastien Racinais; Justin Grantham; Farid El Massioui
Journal:  Int J Hyperthermia       Date:  2010-11-11       Impact factor: 3.914

6.  [Review] Human-Autonomy Teaming: A Review and Analysis of the Empirical Literature.

Authors:  Thomas O'Neill; Nathan McNeese; Amy Barron; Beau Schelble
Journal:  Hum Factors       Date:  2020-10-22       Impact factor: 3.598

7.  Looking for Age Differences in Self-Driving Vehicles: Examining the Effects of Automation Reliability, Driving Risk, and Physical Impairment on Trust.

Authors:  Ericka Rovira; Anne Collins McLaughlin; Richard Pak; Luke High
Journal:  Front Psychol       Date:  2019-04-26

8.  A Framework for Evaluating Field-Based, High-Throughput Phenotyping Systems: A Meta-Analysis.

Authors:  Sierra N Young
Journal:  Sensors (Basel)       Date:  2019-08-17       Impact factor: 3.576

9.  The influence of spatial ability and experience on performance during spaceship rendezvous and docking.

Authors:  Xiaoping Du; Yijing Zhang; Yu Tian; Weifen Huang; Bin Wu; Jingyu Zhang
Journal:  Front Psychol       Date:  2015-07-15

10.  [Review] Autopilot, Mind Wandering, and the Out of the Loop Performance Problem.

Authors:  Jonas Gouraud; Arnaud Delorme; Bruno Berberian
Journal:  Front Neurosci       Date:  2017-10-05       Impact factor: 4.677

