| Literature DB >> 27766314 |
Martin Dobricki, Paul Pauli.
Abstract
Almost all living species regularly explore environments that they experience as pleasant, aversive, arousing, or frightening. We postulate that such exploratory behavior and emotional experience are both regulated by the interdependent perception of one's body and of the stimuli that collectively define a spatial context, such as a cliff. Here we examined this by testing whether the interaction of sensory input on one's gait and sensory input on the spatial context modulates both the emotional experience of the environment and its exploration through head motion. To this end, we asked healthy humans to explore a life-sized Virtual Reality simulation of a forest glade by physically walking around in this environment on two narrow rectangular platforms connected by a plank. The platforms and the plank were presented either as placed on the ground or on top of two high bridge piers. Hence, the forest glade was presented either as a "ground" or as a "height" context. Within these two spatial contexts, the virtual plank was projected either onto the rigid physical floor or onto a bouncy physical plank. Accordingly, the gait of our participants while crossing the virtual plank was either "smooth" or "bouncy." We found that in the height context, bouncy gait compared to smooth gait increased the orientation of the head below the horizon and intensified the experience of the environment as negative, whereas in the ground context, bouncy gait increased the orientation of the head towards and above the horizon and led to the environment being experienced as positive. Our findings suggest that the brain of healthy humans uses the interaction of sensory input on gait and sensory input on the spatial context to regulate both the emotional experience of the environment and its exploration through head motion.
Keywords: Neuroscience; Psychology
Year: 2016 PMID: 27766314 PMCID: PMC5067243 DOI: 10.1016/j.heliyon.2016.e00173
Source DB: PubMed Journal: Heliyon ISSN: 2405-8440
Fig. 1 The experimental setup. (A) Outside view of the Cave Automatic Virtual Environment (CAVE) device during experimentation. (B) Third-person perspective on the forest glade and the two bridge piers with the plank. (C) First-person perspective in the height context. (D) Wireframe with specifications of the area (red) for which the time was determined that the head was bent beside the plank. (E) The vertically deflecting physical plank onto which the virtual plank was projected. (F) First-person perspective in the ground context.
Fig. 2 Environmental exploration and valence in the four scenarios. The graph shows the median of sensed environmental valence and the median proportion of total exploration time during which the head was oriented below the horizon or towards and above the horizon in the four experimental scenarios.
Fig. 3 Environment exploration with the head bent beside the plank and environment-related arousal in the four scenarios. The graph shows the median sensed environment-related arousal and the median portion of the time spent on the plank with the head oriented below the horizon and bent beside the plank in the four scenarios.