
Intramodal perceptual grouping modulates multisensory integration: evidence from the crossmodal dynamic capture task.

Daniel Sanabria, Salvador Soto-Faraco, Jason Chan, Charles Spence.

Abstract

We investigated the extent to which intramodal visual perceptual grouping influences the multisensory integration (or grouping) of auditory and visual motion information. Participants discriminated the direction of motion of two sequentially presented sounds (moving leftward or rightward), while simultaneously trying to ignore a task-irrelevant visual apparent motion stream. The principles of perceptual grouping were used to vary the direction and extent of apparent motion within the irrelevant modality (vision). The results demonstrate that the multisensory integration of motion information can be modulated by the perceptual grouping taking place unimodally within vision, suggesting that unimodal perceptual grouping processes precede multisensory integration. The present study therefore illustrates how intramodal and crossmodal perceptual grouping processes interact to determine how the information in complex multisensory environments is parsed.


Year:  2004        PMID: 15722188     DOI: 10.1016/j.neulet.2004.11.069

Source DB:  PubMed          Journal:  Neurosci Lett        ISSN: 0304-3940            Impact factor:   3.046


Related articles: 13 in total

1.  When does visual perceptual grouping affect multisensory integration?

Authors:  Daniel Sanabria; Salvador Soto-Faraco; Jason S Chan; Charles Spence
Journal:  Cogn Affect Behav Neurosci       Date:  2004-06       Impact factor: 3.282

2.  Assessing the effect of visual and tactile distractors on the perception of auditory apparent motion.

Authors:  Daniel Sanabria; Salvador Soto-Faraco; Charles Spence
Journal:  Exp Brain Res       Date:  2005-08-26       Impact factor: 1.972

3.  The modulation of crossmodal integration by unimodal perceptual grouping: a visuotactile apparent motion study.

Authors:  Georgina Lyons; Daniel Sanabria; Argiro Vatakis; Charles Spence
Journal:  Exp Brain Res       Date:  2006-05-23       Impact factor: 1.972

4.  Computing an optimal time window of audiovisual integration in focused attention tasks: illustrated by studies on effect of age and prior knowledge.

Authors:  Hans Colonius; Adele Diederich
Journal:  Exp Brain Res       Date:  2011-05-31       Impact factor: 1.972

5.  Segmentation, grouping and accentuation during stimulus perception.

Authors:  E N Sokolov; N I Nezlina
Journal:  Neurosci Behav Physiol       Date:  2010-02-11

6.  Spatiotemporal interactions between audition and touch depend on hand posture.

Authors:  Daniel Sanabria; Salvador Soto-Faraco; Charles Spence
Journal:  Exp Brain Res       Date:  2005-06-08       Impact factor: 1.972

7.  Visual search for a target changing in synchrony with an auditory signal.

Authors:  Waka Fujisaki; Ansgar Koene; Derek Arnold; Alan Johnston; Shin'ya Nishida
Journal:  Proc Biol Sci       Date:  2006-04-07       Impact factor: 5.349

8.  Audio-visual speech timing sensitivity is enhanced in cluttered conditions.

Authors:  Warrick Roseboom; Shin'ya Nishida; Waka Fujisaki; Derek H Arnold
Journal:  PLoS One       Date:  2011-04-06       Impact factor: 3.240

9.  Audiovisual integration of speech in a bistable illusion.

Authors:  K G Munhall; M W ten Hove; M Brammer; M Paré
Journal:  Curr Biol       Date:  2009-04-02       Impact factor: 10.834

10.  The role of spatiotemporal and spectral cues in segregating short sound events: evidence from auditory Ternus display.

Authors:  Qingcui Wang; Ming Bao; Lihan Chen
Journal:  Exp Brain Res       Date:  2013-10-20       Impact factor: 1.972

