| Literature DB >> 24315926 |
Irit Sella, Miriam Reiner, Hillel Pratt.
Abstract
Cues that involve a number of sensory modalities are processed in the brain in an interactive multimodal manner rather than independently for each modality. We studied multimodal integration in a natural, yet fully controlled scene, implemented as an interactive game in an auditory-haptic-visual virtual environment. In this imitation of a natural scene, the targets of perception were ecologically valid uni-, bi- and tri-modal manifestations of a simple event: a ball hitting a wall. Subjects were engaged in the game while their behavioral and early cortical electrophysiological responses were measured. Behavioral results confirmed that tri-modal cues were detected faster and more accurately than bi-modal cues, which, likewise, showed advantages over unimodal responses. Event-Related Potentials (ERPs) were recorded, and the first 200 ms following stimulus onset were analyzed to reveal the latencies of cortical multimodal interactions as estimated by sLORETA. These electrophysiological findings indicated bi-modal as well as tri-modal interactions beginning very early (~30 ms), uniquely for each multimodal combination. The results suggest that early cortical multimodal integration accelerates cortical activity and, in turn, enhances performance measures. This acceleration registers on the scalp as sub-additive cortical activation.
Keywords: Audition; Event-Related Potentials; Evoked potentials; Haptics; Human sensory integration; Source estimations; Virtual reality; Vision
Year: 2013 PMID: 24315926 DOI: 10.1016/j.ijpsycho.2013.11.003
Source DB: PubMed Journal: Int J Psychophysiol ISSN: 0167-8760 Impact factor: 2.997