
Saliency in VR: How Do People Explore Virtual Environments?

Vincent Sitzmann, Ana Serrano, Amy Pavel, Maneesh Agrawala, Diego Gutierrez, Belen Masia, Gordon Wetzstein.   

Abstract

Understanding how people explore immersive virtual environments is crucial for many applications, such as designing virtual reality (VR) content, developing new compression algorithms, or learning computational models of saliency or visual attention. Whereas a body of recent work has focused on modeling saliency in desktop viewing conditions, VR is very different from these conditions in that viewing behavior is governed by stereoscopic vision and by the complex interaction of head orientation, gaze, and other kinematic constraints. To further our understanding of viewing behavior and saliency in VR, we capture and analyze gaze and head orientation data of 169 users exploring stereoscopic, static omni-directional panoramas, for a total of 1980 head and gaze trajectories for three different viewing conditions. We provide a thorough analysis of our data, which leads to several important insights, such as the existence of a particular fixation bias, which we then use to adapt existing saliency predictors to immersive VR conditions. In addition, we explore other applications of our data and analysis, including automatic alignment of VR video cuts, panorama thumbnails, panorama video synopsis, and saliency-based compression.

Year:  2018        PMID: 29553930     DOI: 10.1109/TVCG.2018.2793599

Source DB:  PubMed          Journal:  IEEE Trans Vis Comput Graph        ISSN: 1077-2626            Impact factor:   4.579


  9 in total

1.  Towards Making Videos Accessible for Low Vision Screen Magnifier Users.

Authors:  Ali Selman Aydin; Shirin Feiz; Vikas Ashok; I V Ramakrishnan
Journal:  IUI       Date:  2020-03

2.  Saliency-Aware Subtle Augmentation Improves Human Visual Search Performance in VR.

Authors:  Olga Lukashova-Sanz; Siegfried Wahl
Journal:  Brain Sci       Date:  2021-02-25

3.  All-passive transformable optical mapping near-eye display.

Authors:  Wei Cui; Liang Gao
Journal:  Sci Rep       Date:  2019-04-15       Impact factor: 4.379

4.  Larger visual changes compress time: The inverted effect of asemantic visual features on interval time perception.

Authors:  Sandra Malpica; Belen Masia; Laura Herman; Gordon Wetzstein; David M Eagleman; Diego Gutierrez; Zoya Bylinskii; Qi Sun
Journal:  PLoS One       Date:  2022-03-22       Impact factor: 3.240

5.  Target Eccentricity and Form Influences Disparity Vergence Eye Movements Responses: A Temporal and Dynamic Analysis.

Authors:  Chang Yaramothu; Rajbir S Jaswal; Tara L Alvarez
Journal:  J Eye Mov Res       Date:  2019-12-09       Impact factor: 0.957

6.  Gravitational models explain shifts on human visual attention.

Authors:  Dario Zanca; Marco Gori; Stefano Melacci; Alessandra Rufa
Journal:  Sci Rep       Date:  2020-10-01       Impact factor: 4.379

7.  Active vision in immersive, 360° real-world environments.

Authors:  Amanda J Haskins; Jeff Mentch; Thomas L Botch; Caroline E Robertson
Journal:  Sci Rep       Date:  2020-08-31       Impact factor: 4.379

8.  Aspects of visual avatar appearance: self-representation, display type, and uncanny valley.

Authors:  Daniel Hepperle; Christian Felix Purps; Jonas Deuchler; Matthias Wölfel
Journal:  Vis Comput       Date:  2021-06-17       Impact factor: 2.835

9.  What are the visuo-motor tendencies of omnidirectional scene free-viewing in virtual reality?

Authors:  Erwan Joël David; Pierre Lebranchu; Matthieu Perreira Da Silva; Patrick Le Callet
Journal:  J Vis       Date:  2022-03-02       Impact factor: 2.240
