
Hearing non-signers use their gestures to predict iconic form-meaning mappings at first exposure to signs.

Gerardo Ortega, Annika Schiefner, Aslı Özyürek.

Abstract

The sign languages of deaf communities and the gestures produced by hearing people are communicative systems that exploit the manual-visual modality as a means of expression. Despite their striking differences, they share the property of iconicity, understood as the direct relationship between a symbol and its referent. Here we investigate whether non-signing hearing adults exploit their implicit knowledge of gestures to bootstrap accurate understanding of the meaning of iconic signs they have never seen before. In Study 1 we show that for some concepts gestures exhibit systematic forms across participants, and that these gestures share different degrees of form overlap with the signs for the same concepts (full, partial, and no overlap). In Study 2 we found that signs with stronger resemblance to gestures are guessed more accurately and are assigned higher iconicity ratings by non-signers than signs with low overlap. In addition, when more people produced a systematic gesture resembling a sign, they assigned higher iconicity ratings to that sign. Furthermore, participants showed a bias to assume that signs represent actions rather than objects. The similarities between some signs and gestures could be explained by deaf signers and hearing gesturers sharing a conceptual substrate rooted in our embodied experiences with the world. The finding that gestural knowledge can ease the interpretation of the meaning of novel signs and predicts iconicity ratings is in line with embodied accounts of cognition and with the influence of prior knowledge on the acquisition of new schemas. Through these mechanisms, we propose that iconic gestures that overlap in form with signs may serve as a type of 'manual cognates' that help non-signing adults break into a new language at first exposure.
Copyright © 2019 Elsevier B.V. All rights reserved.

Keywords:  Form-meaning mappings; Gesture; Iconicity; Iconicity ratings; Manual modality; Sign language

Year:  2019        PMID: 31238248     DOI: 10.1016/j.cognition.2019.06.008

Source DB:  PubMed          Journal:  Cognition        ISSN: 0010-0277


Related articles:  6 in total

1.  L2M1 and L2M2 Acquisition of Sign Lexicon: The Impact of Multimodality on the Sign Second Language Acquisition.

Authors:  Krister Schönström; Ingela Holmström
Journal:  Front Psychol       Date:  2022-06-10

2.  People infer communicative action through an expectation for efficient communication.

Authors:  Amanda Royka; Annie Chen; Rosie Aboody; Tomas Huanca; Julian Jara-Ettinger
Journal:  Nat Commun       Date:  2022-07-18       Impact factor: 17.694

3.  Mapping Word to World in ASL: Evidence from a Human Simulation Paradigm.

Authors:  Allison Fitch; Sudha Arunachalam; Amy M Lieberman
Journal:  Cogn Sci       Date:  2021-12

4.  Breaking Into Language in a New Modality: The Role of Input and Individual Differences in Recognising Signs.

Authors:  Julia Elisabeth Hofweber; Lizzy Aumonier; Vikki Janke; Marianne Gullberg; Chloe Marshall
Journal:  Front Psychol       Date:  2022-05-18

5.  Visual form of ASL verb signs predicts non-signer judgment of transitivity.

Authors:  Chuck Bradley; Evie A Malaia; Jeffrey Mark Siskind; Ronnie B Wilbur
Journal:  PLoS One       Date:  2022-02-25       Impact factor: 3.240

6.  Gesture is the primary modality for language creation.

Authors:  Nicolas Fay; Bradley Walker; T Mark Ellison; Zachary Blundell; Naomi De Kleine; Murray Garde; Casey J Lister; Susan Goldin-Meadow
Journal:  Proc Biol Sci       Date:  2022-03-09       Impact factor: 5.349

