| Literature DB >> 34415367 |
Alice Bollini¹, Davide Esposito²,³, Claudio Campus², Monica Gori².
Abstract
The human brain builds a representation of the external world based on magnitude judgments, estimating distance, numerosity, or size. Magnitude and spatial representation are hypothesized to rely on common mechanisms shared across sensory modalities. We explored the relationship between magnitude and spatial representation using two different sensory systems, hypothesizing that space and magnitude interact differently depending on the sensory modality. Furthermore, we aimed to understand the role of the spatial reference frame in magnitude representation. To investigate these processes, we used the stimulus-response compatibility (SRC) paradigm, which assumes that performance improves when stimulus and response share common features, and designed an auditory and a tactile SRC task with conflicting spatial and magnitude mappings. Our results showed that sensory modality modulates the relationship between space and magnitude. In the tactile task, magnitude congruency had a larger effect than spatial congruency; in the auditory task, magnitude and space carried similar weight, with neither spatial congruency nor magnitude congruency having a significant effect. Moreover, we observed that the spatial frame activated during the tasks depended on the sensory input. In the tactile task, participants' performance reversed between the uncrossed- and crossed-hands postures, suggesting an internal coordinate system; in the auditory task, by contrast, crossing the hands did not alter performance (i.e., an allocentric frame of reference was used). Overall, these results suggest that the interaction between space and magnitude differs between the auditory and tactile modalities, supporting the idea that these sensory modalities use different magnitude and spatial representation mechanisms.
Keywords: Audition; Frame of reference; Magnitude; Spatial representation; Touch
Year: 2021 PMID: 34415367 PMCID: PMC8536643 DOI: 10.1007/s00221-021-06196-4
Source DB: PubMed Journal: Exp Brain Res ISSN: 0014-4819 Impact factor: 1.972
Fig. 1 Schematic representation of the task setup and procedure. Panels A and B show the time course of a trial in the uncrossed-hands posture (top) and crossed-hands posture (bottom) for the auditory (A) and tactile (B) tasks. Panel C shows the schema of the experimental conditions for a high-frequency stimulus in the two groups: the black arrow indicates the stimulus position, and the black response key indicates the correct response key.
Fig. 2 Results of the MAGNITUDE-aligned and MAGNITUDE-misaligned groups in the auditory and tactile stimulus–response tasks. Left panel: LISAS scores for the MAGNITUDE-aligned (MA) group. Right panel: LISAS scores for the MAGNITUDE-misaligned (MM) group. Error bars represent the standard error of the mean (SEM); gray points represent single-subject values.
Fig. 3 Δ-Spatial (spatial-incongruent MINUS spatial-congruent) for the MAGNITUDE-aligned and MAGNITUDE-misaligned groups. The left panel shows results for the auditory task; the right panel shows results for the tactile task. Error bars represent the standard error of the mean (SEM); gray points represent single-subject performance. * indicates p_bonf < 0.05, *** indicates p_bonf < 0.001, and ns indicates a non-significant result.