
Characterizing eye movement behaviors and kinematics of non-human primates during virtual navigation tasks.

Benjamin W Corrigan, Roberto A Gulli, Guillaume Doucet, Julio C Martinez-Trujillo

Abstract

Virtual environments (VEs) allow testing of complex behaviors in naturalistic settings by combining highly controlled visual stimuli with spatial navigation and other cognitive tasks. They also allow recording of eye movements with high-precision eye-tracking techniques, which is important for electrophysiological studies examining the response properties of neurons in visual areas of nonhuman primates. However, during virtual navigation the pattern of retinal stimulation can be highly dynamic, which may influence eye movements. Here we examine whether and how eye movement patterns change as a function of dynamic visual stimulation during virtual navigation tasks, relative to standard oculomotor tasks. We trained two rhesus macaques to use a joystick to navigate in a VE to complete two tasks. To contrast VE behavior with classic measurements, the monkeys also performed a simple Cued Saccade task. We used a robust algorithm for rapid classification of saccades, fixations, and smooth pursuits. We then analyzed the kinematics of saccades during all tasks, and specifically during different phases of the VE tasks. We found that fixation-to-smooth-pursuit ratios were smaller in VE tasks (4:5) than in the Cued Saccade task (7:1), reflecting more intensive use of smooth pursuit to foveate targets in VE than in a standard visually guided saccade task or during spontaneous fixations. Saccades made to rewarded targets (exploitation) tended to have higher peak velocities than saccades made to unrewarded objects (exploration). VE exploitation saccades were 6% slower than saccades to discrete targets in the Cued Saccade task. Virtual environments represent a technological advance in experimental design for nonhuman primates. Here we provide a framework to study how eye movements change between and within static and dynamic displays.

Year: 2017        PMID: 29071352        DOI: 10.1167/17.12.15

Source DB: PubMed        Journal: J Vis        ISSN: 1534-7362        Impact factor: 2.240


Related articles: 5 in total

1.  The application of noninvasive, restraint-free eye-tracking methods for use with nonhuman primates.

Authors:  Lydia M Hopper; Roberto A Gulli; Lauren H Howard; Fumihiro Kano; Christopher Krupenye; Amy M Ryan; Annika Paukner
Journal:  Behav Res Methods       Date:  2021-06

2.  Context-dependent representations of objects and space in the primate hippocampus during virtual navigation.

Authors:  Roberto A Gulli; Lyndon R Duong; Benjamin W Corrigan; Guillaume Doucet; Sylvain Williams; Stefano Fusi; Julio C Martinez-Trujillo
Journal:  Nat Neurosci       Date:  2019-12-23       Impact factor: 24.884

3.  Differential Generation of Saccade, Fixation, and Image-Onset Event-Related Potentials in the Human Mesial Temporal Lobe.

Authors:  Chaim N Katz; Kramay Patel; Omid Talakoub; David Groppe; Kari Hoffman; Taufik A Valiante
Journal:  Cereb Cortex       Date:  2020-09-03       Impact factor: 5.357

4.  Applying TS-DBN model into sports behavior recognition with deep learning approach.

Authors:  Yingqing Guo; Xin Wang
Journal:  J Supercomput       Date:  2021-04-06       Impact factor: 2.474

5.  Chimpanzees (Pan troglodytes) navigate to find hidden fruit in a virtual environment.

Authors:  Matthias Allritz; Josep Call; Ken Schweller; Emma S McEwen; Miguel de Guinea; Karline R L Janmaat; Charles R Menzel; Francine L Dolins
Journal:  Sci Adv       Date:  2022-06-24       Impact factor: 14.957


Beijing Coyote Bioscience Co., Ltd. © 2022-2023.