
Reading your own lips: common-coding theory and visual speech perception.

Nancy Tye-Murray, Brent P Spehar, Joel Myerson, Sandra Hale, Mitchell S Sommers.

Abstract

Common-coding theory posits that (1) perceiving an action activates the same representations of motor plans that are activated by actually performing that action, and (2) because of individual differences in the ways that actions are performed, observing recordings of one's own previous behavior activates motor plans to an even greater degree than does observing someone else's behavior. We hypothesized that if observing oneself activates motor plans to a greater degree than does observing others, and if these activated plans contribute to perception, then people should be able to lipread silent video clips of their own previous utterances more accurately than they can lipread video clips of other talkers. As predicted, two groups of participants were able to lipread video clips of themselves, recorded more than two weeks earlier, significantly more accurately than video clips of others. These results suggest that visual input activates speech motor activity that links to word representations in the mental lexicon.


Year:  2013        PMID: 23132604      PMCID: PMC3558632          DOI: 10.3758/s13423-012-0328-5

Source DB:  PubMed          Journal:  Psychon Bull Rev        ISSN: 1069-9384


References  (20 in total)

1.  Recognition of self-generated actions from kinematic displays of drawing.

Authors:  G Knoblich; W Prinz
Journal:  J Exp Psychol Hum Percept Perform       Date:  2001-04       Impact factor: 3.332

2.  New insights on sensorimotor integration: from hand action to speech perception.

Authors:  Luciano Fadiga; Laila Craighero
Journal:  Brain Cogn       Date:  2003-12       Impact factor: 2.310

3.  [Review]  Lipreading and audio-visual speech perception.

Authors:  Q Summerfield
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  1992-01-29       Impact factor: 6.237

4.  [Review]  The Theory of Event Coding (TEC): a framework for perception and action planning.

Authors:  B Hommel; J Müsseler; G Aschersleben; W Prinz
Journal:  Behav Brain Sci       Date:  2001-10       Impact factor: 12.579

5.  Perceiving action identity: how pianists recognize their own performances.

Authors:  Bruno H Repp; Günther Knoblich
Journal:  Psychol Sci       Date:  2004-09

6.  Motor cortex maps articulatory features of speech sounds.

Authors:  Friedemann Pulvermüller; Martina Huss; Ferath Kherif; Fermin Moscoso del Prado Martin; Olaf Hauk; Yury Shtyrov
Journal:  Proc Natl Acad Sci U S A       Date:  2006-05-08       Impact factor: 11.205

7.  [Review]  The motor theory of speech perception reviewed.

Authors:  Bruno Galantucci; Carol A Fowler; M T Turvey
Journal:  Psychon Bull Rev       Date:  2006-06

8.  The English Lexicon Project.

Authors:  David A Balota; Melvin J Yap; Michael J Cortese; Keith A Hutchison; Brett Kessler; Bjorn Loftis; James H Neely; Douglas L Nelson; Greg B Simpson; Rebecca Treiman
Journal:  Behav Res Methods       Date:  2007-08

9.  Authorship effects in the prediction of handwriting strokes: evidence for action simulation during action perception.

Authors:  Gunther Knoblich; Eva Seigerschmidt; Rüdiger Flach; Wolfgang Prinz
Journal:  Q J Exp Psychol A       Date:  2002-07

10.  Reading fluent speech from talking faces: typical brain networks and individual differences.

Authors:  Deborah A Hall; Clayton Fussell; A Quentin Summerfield
Journal:  J Cogn Neurosci       Date:  2005-06       Impact factor: 3.225

Cited by  (7 in total)

1.  Electrophysiological evidence for a self-processing advantage during audiovisual speech integration.

Authors:  Avril Treille; Coriandre Vilain; Sonia Kandel; Marc Sato
Journal:  Exp Brain Res       Date:  2017-07-04       Impact factor: 1.972

2.  [Review]  Prediction and constraint in audiovisual speech perception.

Authors:  Jonathan E Peelle; Mitchell S Sommers
Journal:  Cortex       Date:  2015-03-20       Impact factor: 4.027

3.  The self-advantage in visual speech processing enhances audiovisual speech recognition in noise.

Authors:  Nancy Tye-Murray; Brent P Spehar; Joel Myerson; Sandra Hale; Mitchell S Sommers
Journal:  Psychon Bull Rev       Date:  2015-08

4.  Increased Connectivity among Sensory and Motor Regions during Visual and Audiovisual Speech Perception.

Authors:  Jonathan E Peelle; Brent Spehar; Michael S Jones; Sarah McConkey; Joel Myerson; Sandra Hale; Mitchell S Sommers; Nancy Tye-Murray
Journal:  J Neurosci       Date:  2021-11-23       Impact factor: 6.709

5.  Do We Perceive Others Better than Ourselves? A Perceptual Benefit for Noise-Vocoded Speech Produced by an Average Speaker.

Authors:  William L Schuerman; Antje Meyer; James M McQueen
Journal:  PLoS One       Date:  2015-07-02       Impact factor: 3.240

6.  Computer simulations of coupled idiosyncrasies in speech perception and speech production with COSMO, a perceptuo-motor Bayesian model of speech communication.

Authors:  Marie-Lou Barnaud; Jean-Luc Schwartz; Pierre Bessière; Julien Diard
Journal:  PLoS One       Date:  2019-01-11       Impact factor: 3.240

7.  The own-voice benefit for word recognition in early bilinguals.

Authors:  Sarah Cheung; Molly Babel
Journal:  Front Psychol       Date:  2022-09-02

Beijing Coyote Bioscience Co., Ltd. (北京卡尤迪生物科技股份有限公司) © 2022-2023.