
Integration of auditory and visual information in the recognition of realistic objects.

Clara Suied, Nicolas Bonneel, Isabelle Viaud-Delmon.

Abstract

Recognizing a natural object requires pooling information from various sensory modalities while ignoring information from competing objects. Because the same semantic knowledge can be accessed through different modalities, it is possible to explore the retrieval of supramodal object concepts. Here, object-recognition processes were investigated by manipulating the relationships between sensory modalities, specifically the semantic content and spatial alignment of auditory and visual information. Experiments were run in a realistic virtual environment. Participants were asked to react as fast as possible to a target object presented in the visual and/or the auditory modality and to inhibit a distractor object (go/no-go task). Spatial alignment had no effect on object-recognition time; the only spatial effect observed was a stimulus-response compatibility between the location of the auditory stimulus and the responding hand. Reaction times to semantically congruent bimodal stimuli were significantly shorter than would be predicted by independent processing of the auditory and visual targets. Interestingly, this bimodal facilitation effect was twice as large as that found in previous studies that also used information-rich stimuli. An interference effect (i.e., longer reaction times to semantically incongruent stimuli than to the corresponding unimodal stimulus) was observed only when the distractor was auditory; when the distractor was visual, semantic incongruence did not interfere with object recognition. Our results show that immersive displays with large visual stimuli can produce large multimodal integration effects, and they reveal a possible asymmetry in the attentional filtering of irrelevant auditory and visual information.
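The abstract's claim that congruent bimodal reaction times were shorter "than would be predicted by independent processing" refers to a race-model comparison (cf. Mordkoff & Yantis, reference 5 below). The following sketch, assuming NumPy and hypothetical per-condition reaction-time arrays (`rt_av`, `rt_a`, `rt_v`), illustrates one common way such a prediction is tested, via Miller's race-model inequality; it is not the authors' exact analysis.

```python
import numpy as np

def race_model_bound(rt_a, rt_v, t):
    """Miller's race-model inequality: under independent processing of the
    auditory and visual signals, P(RT <= t | AV) can be no larger than
    P(RT <= t | A) + P(RT <= t | V)."""
    f_a = np.mean(np.asarray(rt_a) <= t)  # unimodal auditory CDF at t
    f_v = np.mean(np.asarray(rt_v) <= t)  # unimodal visual CDF at t
    return min(1.0, f_a + f_v)

def violates_race_model(rt_av, rt_a, rt_v, probs=np.arange(0.05, 1.0, 0.05)):
    """True if the observed bimodal CDF exceeds the race-model bound at any
    tested quantile of the pooled reaction-time distribution, i.e. bimodal
    responses are faster than independent unimodal races could produce."""
    pooled = np.concatenate([np.asarray(rt_a), np.asarray(rt_v), np.asarray(rt_av)])
    ts = np.quantile(pooled, probs)
    f_av = np.array([np.mean(np.asarray(rt_av) <= t) for t in ts])
    bound = np.array([race_model_bound(rt_a, rt_v, t) for t in ts])
    return bool(np.any(f_av > bound))
```

A `True` result would indicate coactivation (facilitation beyond statistical redundancy), which is the kind of effect the abstract reports for semantically congruent bimodal stimuli.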

Year:  2008        PMID: 19093105     DOI: 10.1007/s00221-008-1672-6

Source DB:  PubMed          Journal:  Exp Brain Res        ISSN: 0014-4819            Impact factor:   1.972


References: 42 in total

1.  Reaction time as a measure of intersensory facilitation.

Authors:  M. Hershenson
Journal:  J Exp Psychol       Date:  1962-03

2.  Semantic congruence is a critical factor in multisensory behavioral performance.

Authors:  Paul J Laurienti; Robert A Kraft; Joseph A Maldjian; Jonathan H Burdette; Mark T Wallace
Journal:  Exp Brain Res       Date:  2004-06-18       Impact factor: 1.972

3.  Multisensory processing in the redundant-target effect: a behavioral and event-related potential study.

Authors:  Matthias Gondan; Birgit Niederhaus; Frank Rösler; Brigitte Röder
Journal:  Percept Psychophys       Date:  2005-05

4.  What you see is not (always) what you hear: induced gamma band responses reflect cross-modal interactions in familiar object recognition.

Authors:  Shlomit Yuval-Greenberg; Leon Y Deouell
Journal:  J Neurosci       Date:  2007-01-31       Impact factor: 6.167

5.  An interactive race model of divided attention.

Authors:  J T Mordkoff; S Yantis
Journal:  J Exp Psychol Hum Percept Perform       Date:  1991-05       Impact factor: 3.332

6.  Crossmodal identification.

Authors:  G A Calvert; M J Brammer; S D Iversen
Journal:  Trends Cogn Sci       Date:  1998-07-01       Impact factor: 20.229

7.  Dependence of target redundancy effects on noise conditions and number of targets.

Authors:  G R Grice; J W Gwynne
Journal:  Percept Psychophys       Date:  1987-07

8.  A computational model of the Simon effect.

Authors:  M Zorzi; C Umiltà
Journal:  Psychol Res       Date:  1995

9.  Interactions between exogenous auditory and visual spatial attention.

Authors:  M Schmitt; A Postma; E De Haan
Journal:  Q J Exp Psychol A       Date:  2000-02

10.  [Review] Multisensory integration: space, time and superadditivity.

Authors:  Nicholas P Holmes; Charles Spence
Journal:  Curr Biol       Date:  2005-09-20       Impact factor: 10.834

Cited by: 15 in total

1.  Simultaneous perception of a spoken and a signed language: The brain basis of ASL-English code-blends.

Authors:  Jill Weisberg; Stephen McCullough; Karen Emmorey
Journal:  Brain Lang       Date:  2015-07-10       Impact factor: 2.381

2.  Semantic incongruity influences response caution in audio-visual integration.

Authors:  Benjamin Steinweg; Fred W Mast
Journal:  Exp Brain Res       Date:  2016-10-12       Impact factor: 1.972

3.  Multisensory aversive stimuli differentially modulate negative feelings in near and far space.

Authors:  Marine Taffou; Jan Ondřej; Carol O'Sullivan; Olivier Warusfel; Stéphanie Dubal; Isabelle Viaud-Delmon
Journal:  Psychol Res       Date:  2016-05-05

4.  Semantically congruent audiovisual integration with modal-based attention accelerates auditory short-term memory retrieval.

Authors:  Hongtao Yu; Aijun Wang; Ming Zhang; JiaJia Yang; Satoshi Takahashi; Yoshimichi Ejima; Jinglong Wu
Journal:  Atten Percept Psychophys       Date:  2022-05-31       Impact factor: 2.199

5.  Auditory enhancement of visual searches for event scenes.

Authors:  Tomoki Maezawa; Miho Kiyosawa; Jun I Kawahara
Journal:  Atten Percept Psychophys       Date:  2022-01-10       Impact factor: 2.199

6.  Auditory scene analysis: the sweet music of ambiguity.

Authors:  Daniel Pressnitzer; Clara Suied; Shihab A Shamma
Journal:  Front Hum Neurosci       Date:  2011-12-14       Impact factor: 3.169

7.  Target categorization with primes that vary in both congruency and sense modality.

Authors:  Kathryn Weatherford; Michael Mills; Anne M Porter; Paula Goolkasian
Journal:  Front Psychol       Date:  2015-01-23

8.  Resources required for processing ambiguous complex features in vision and audition are modality specific.

Authors:  Morgan D Barense; Jonathan Erez; Henry Ma; Rhodri Cusack
Journal:  Cogn Affect Behav Neurosci       Date:  2014-03       Impact factor: 3.526

9.  Auditory-visual object recognition time suggests specific processing for animal sounds.

Authors:  Clara Suied; Isabelle Viaud-Delmon
Journal:  PLoS One       Date:  2009-04-22       Impact factor: 3.240

10.  Neural Correlates of Early Sound Encoding and their Relationship to Speech-in-Noise Perception.

Authors:  Emily B J Coffey; Alexander M P Chepesiuk; Sibylle C Herholz; Sylvain Baillet; Robert J Zatorre
Journal:  Front Neurosci       Date:  2017-08-25       Impact factor: 4.677


北京卡尤迪生物科技股份有限公司 (Beijing Coyote Bioscience Co., Ltd.) © 2022-2023.