Literature DB >> 27504680

Is it all task-specific? The role of binary responses, verbal mediation, and saliency for eliciting language-space associations.

Carolin Dudschig, Barbara Kaup.

Abstract

Associations between language and space are of central interest for grounded models of language comprehension. Various studies show that reading words such as bird or shoe results in faster responses toward the typical location of the corresponding entity (e.g., after bird, upward responses are faster than downward responses). Critically, as of yet, the mechanisms underlying these effects and their boundary conditions are widely unknown. In fact, it cannot be ruled out that these effects are by-products of processing that only occur in very specific task settings. Here we investigated the role of 3 major processes (response set, labeling, and saliency) that might underlie these compatibility effects in Stroop-like paradigms. In Experiment 1, we aimed at overcoming the binary nature of the response set by including responses both in the vertical and the horizontal dimension. In Experiment 2 no memorizing of the color-to-response mapping was required, making internal response labeling unnecessary. In Experiment 3 this was replicated in a mouse-tracking setup. In all experiments a clear language-space association was observed. Critically, in a final experiment not only the saliency of verticality in the response set but also in the stimulus set was reduced. Here no language-space association was observed. Together these results suggest that language-space associations extend beyond bipolar response settings and that internal response labeling is not a precondition for finding these compatibility effects. However, the vertical dimension needs to be salient either in the stimulus or response set. Implications for models of language comprehension are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).


Year:  2016        PMID: 27504680     DOI: 10.1037/xlm0000297

Source DB:  PubMed          Journal:  J Exp Psychol Learn Mem Cogn        ISSN: 0278-7393            Impact factor:   3.051


Related articles: 7 in total

1.  Do I need to have my hands free to understand hand-related language? Investigating the functional relevance of experiential simulations.

Authors:  Jessica Vanessa Strozyk; Carolin Dudschig; Barbara Kaup
Journal:  Psychol Res       Date:  2017-08-02

2.  Reading sentences describing high- or low-pitched auditory events: only pianists show evidence for a horizontal space-pitch association.

Authors:  Sibylla Wolter; Carolin Dudschig; Barbara Kaup
Journal:  Psychol Res       Date:  2016-10-12

3.  Green as a cbemcuru: modal as well as amodal color cues can help to solve anagrams.

Authors:  Eduard Berndt; Carolin Dudschig; Barbara Kaup
Journal:  Psychol Res       Date:  2018-07-11

4.  The Sounds of Sentences: Differentiating the Influence of Physical Sound, Sound Imagery, and Linguistically Implied Sounds on Physical Sound Processing.

Authors:  Carolin Dudschig; Ian Grant Mackenzie; Jessica Strozyk; Barbara Kaup; Hartmut Leuthold
Journal:  Cogn Affect Behav Neurosci       Date:  2016-10       Impact factor: 3.282

5.  How do German bilingual schoolchildren process German prepositions? - A study on language-motor interactions.

Authors:  Daniela Katharina Ahlberg; Heike Bischoff; Jessica Vanessa Strozyk; Doreen Bryant; Barbara Kaup
Journal:  PLoS One       Date:  2018-03-14       Impact factor: 3.240

6.  Replacing vertical actions by mouse movements: a web-suited paradigm for investigating vertical spatial associations.

Authors:  Emanuel Schütt; Ian Grant Mackenzie; Barbara Kaup; Carolin Dudschig
Journal:  Psychol Res       Date:  2022-02-07

7.  The limits of automatic sensorimotor processing during word processing: investigations with repeated linguistic experience, memory consolidation during sleep, and rich linguistic learning contexts.

Authors:  Fritz Günther; Sophia Antonia Press; Carolin Dudschig; Barbara Kaup
Journal:  Psychol Res       Date:  2021-12-01
