
Drivers use active gaze to monitor waypoints during automated driving.

Callum Mole, Jami Pekkanen, William E A Sheppard, Gustav Markkula, Richard M Wilkie.

Abstract

Automated vehicles (AVs) will change the role of the driver, from actively controlling the vehicle to primarily monitoring it. Removing the driver from the control loop could fundamentally change the way that drivers sample visual information from the scene, and in particular, alter the gaze patterns generated when under AV control. To better understand how automation affects gaze patterns, this experiment used tightly controlled experimental conditions with a series of transitions from 'Manual' control to 'Automated' vehicle control. Automated trials were produced using either a 'Replay' of the driver's own steering trajectories or standard 'Stock' trials that were identical for all participants. Gaze patterns produced during Manual and Automated conditions were recorded and compared. Overall, the gaze patterns across conditions were very similar, but detailed analysis shows that drivers looked slightly further ahead (increased gaze time headway) during Automation, with only small differences between Stock and Replay trials. A novel mixture-modelling method decomposed gaze patterns into two distinct categories and revealed that the gaze time headway increased during Automation. Further analyses revealed that while there was a general shift to look further ahead (and fixate the bend entry earlier) when under automated vehicle control, similar waypoint-tracking gaze patterns were produced during Manual driving and Automation. The consistency of gaze patterns across driving modes suggests that active-gaze models (developed for manual driving) might be useful for monitoring driver engagement during Automated driving, with deviations in gaze behaviour from what would be expected during manual control potentially indicating that a driver is not closely monitoring the automated system.

Year:  2021        PMID: 33420150      PMCID: PMC7794576          DOI: 10.1038/s41598-020-80126-2

Source DB:  PubMed          Journal:  Sci Rep        ISSN: 2045-2322            Impact factor:   4.996


References: 55 in total

1. [Review] In what ways do eye movements contribute to everyday activities?

Authors:  M F Land; M Hayhoe
Journal:  Vision Res       Date:  2001       Impact factor: 1.886

2.  Predictable eye-head coordination during driving.

Authors:  M F Land
Journal:  Nature       Date:  1992-09-24       Impact factor: 49.962

3.  Using vision to control locomotion: looking where you want to go.

Authors:  R M Wilkie; G K Kountouriotis; N Merat; J P Wann
Journal:  Exp Brain Res       Date:  2010-06-17       Impact factor: 1.972

4.  Does gaze influence steering around a bend?

Authors:  Katherine D Robertshaw; Richard M Wilkie
Journal:  J Vis       Date:  2008-04-23       Impact factor: 2.240

5.  Getting Back Into the Loop: The Perceptual-Motor Determinants of Successful Transitions out of Automated Driving.

Authors:  Callum D Mole; Otto Lappi; Oscar Giles; Gustav Markkula; Franck Mars; Richard M Wilkie
Journal:  Hum Factors       Date:  2019-03-06       Impact factor: 2.888

6.  Were they in the loop during automated driving? Links between visual attention and crash potential.

Authors:  Tyron Louw; Ruth Madigan; Oliver Carsten; Natasha Merat
Journal:  Inj Prev       Date:  2016-09-21       Impact factor: 2.399

7.  Predicting human visuomotor behaviour in a driving task.

Authors:  Leif Johnson; Brian Sullivan; Mary Hayhoe; Dana Ballard
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2014-01-06       Impact factor: 6.237

8.  Pursuit eye-movements in curve driving differentiate between future path and tangent point models.

Authors:  Otto Lappi; Jami Pekkanen; Teemu H Itkonen
Journal:  PLoS One       Date:  2013-07-22       Impact factor: 3.240

9.  Sustained sensorimotor control as intermittent decisions about prediction errors: computational framework and application to ground vehicle steering.

Authors:  Gustav Markkula; Erwin Boer; Richard Romano; Natasha Merat
Journal:  Biol Cybern       Date:  2018-02-16       Impact factor: 2.086

10.  Humans use Optokinetic Eye Movements to Track Waypoints for Steering.

Authors:  Otto Lappi; Jami Pekkanen; Paavo Rinkkala; Samuel Tuhkanen; Ari Tuononen; Juho-Pekka Virtanen
Journal:  Sci Rep       Date:  2020-03-06       Impact factor: 4.379

