
Perceiving speech from inverted faces.

D W Massaro, M M Cohen.

Abstract

We examined whether the orientation of the face influences speech perception in face-to-face communication. Participants identified auditory syllables, visible syllables, and bimodal syllables presented in an expanded factorial design. The syllables were /ba/, /va/, /oa/, or /da/. The auditory syllables were taken from natural speech whereas the visible syllables were produced by computer animation of a realistic talking face. The animated face was presented either as viewed in normal upright orientation or inverted orientation (180 degrees frontal rotation). The central intent of the study was to determine if an inverted view of the face would change the nature of processing bimodal speech or simply influence the information available in visible speech. The results with both the upright and inverted face views were adequately described by the fuzzy logical model of perception (FLMP). The observed differences in the FLMP's parameter values corresponding to the visual information indicate that inverting the view of the face influences the amount of visible information but does not change the nature of the information processing in bimodal speech perception.
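The FLMP referenced in the abstract combines independent auditory and visual support values multiplicatively and normalizes over the response alternatives (the relative goodness rule). A minimal sketch of that integration rule, using illustrative support values (the paper's fitted parameter values are not reproduced here):

```python
# Sketch of the FLMP integration and decision rule.
# Support values below are illustrative assumptions, not fitted parameters.

def flmp_response_probabilities(auditory, visual):
    """Combine per-alternative auditory and visual support values
    (each in [0, 1]) multiplicatively, then normalize across
    alternatives (relative goodness rule) to get identification
    probabilities."""
    support = {k: auditory[k] * visual[k] for k in auditory}
    total = sum(support.values())
    return {k: s / total for k, s in support.items()}

# Hypothetical bimodal trial: an ambiguous auditory token paired
# with visual information strongly favoring /da/.
auditory = {"ba": 0.5, "da": 0.5}
visual = {"ba": 0.1, "da": 0.9}
print(flmp_response_probabilities(auditory, visual))
# {'ba': 0.1, 'da': 0.9}
```

On this rule, inverting the face would show up only as changed visual support values, leaving the multiplicative integration itself unchanged, which is the interpretation the abstract draws from the model fits.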


Year:  1996        PMID: 8920841     DOI: 10.3758/bf03206832

Source DB:  PubMed          Journal:  Percept Psychophys        ISSN: 0031-5117


References:  19 in total

1.  Selectivity, scope, and simplicity of models: a lesson from fitting judgments of perceived depth.

Authors:  J E Cutting; N Bruno; N P Brady; C Moore
Journal:  J Exp Psychol Gen       Date:  1992-09

2.  The neuropsychology of lipreading. [Review]

Authors:  R Campbell
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  1992-01-29       Impact factor: 6.237

3.  Categorical perception of facial expressions.

Authors:  N L Etcoff; J J Magee
Journal:  Cognition       Date:  1992-09

4.  Hearing lips and seeing voices.

Authors:  H McGurk; J MacDonald
Journal:  Nature       Date:  1976-12-23       Impact factor: 49.962

5.  Integrating speech information across talkers, gender, and sensory modality: female faces and male voices in the McGurk effect.

Authors:  K P Green; P K Kuhl; A N Meltzoff; E B Stevens
Journal:  Percept Psychophys       Date:  1991-12

6.  Crossmodal integration in the identification of consonant segments.

Authors:  L D Braida
Journal:  Q J Exp Psychol A       Date:  1991-08

7.  Before you see it, you see its parts: evidence for feature encoding and integration in preschool children and adults.

Authors:  L A Thompson; D W Massaro
Journal:  Cogn Psychol       Date:  1989-07       Impact factor: 3.468

8.  Face recognition: a general or specific right hemisphere capacity?

Authors:  S C Levine; M T Banich; M P Koch-Weser
Journal:  Brain Cogn       Date:  1988-12       Impact factor: 2.310

9.  Minimodularity and the perception of layout.

Authors:  N Bruno; J E Cutting
Journal:  J Exp Psychol Gen       Date:  1988-06

10.  Cross-linguistic comparisons in the integration of visual and auditory speech.

Authors:  D W Massaro; M M Cohen; P M Smeele
Journal:  Mem Cognit       Date:  1995-01
Cited by:  3 in total

1.  Fuzzy logic: A "simple" solution for complexities in neurosciences?

Authors:  Saniya Siraj Godil; Muhammad Shahzad Shamim; Syed Ather Enam; Uvais Qidwai
Journal:  Surg Neurol Int       Date:  2011-02-26

2.  Visual speech discrimination and identification of natural and synthetic consonant stimuli.

Authors:  Benjamin T Files; Bosco S Tjan; Jintao Jiang; Lynne E Bernstein
Journal:  Front Psychol       Date:  2015-07-13

3.  Spatio-temporal distribution of brain activity associated with audio-visually congruent and incongruent speech and the McGurk Effect.

Authors:  Hillel Pratt; Naomi Bleich; Nomi Mittelman
Journal:  Brain Behav       Date:  2015-10-15       Impact factor: 2.708

