A Zanini, I Patané, E Blini, R Salemme, E Koun, A Farnè, C Brozzoli.
Abstract
Peripersonal space (PPS) is a multisensory representation of the space near body parts facilitating interactions with the close environment. Studies on non-human and human primates agree in showing that PPS is a body part-centered representation that guides actions. Because of these characteristics, growing confusion surrounds peripersonal and arm-reaching space (ARS), that is, the space one's arm can reach. Despite neuroanatomical evidence favoring their distinction, no study has directly contrasted their respective extent and behavioral features. Here, in five experiments (N = 140) we found that PPS differs from ARS, as evidenced both by participants' spatial and temporal performance and by its modeling. We mapped PPS and ARS using both their respective gold standard tasks and a novel multisensory facilitation paradigm. Results show that: (1) PPS is smaller than ARS; (2) multivariate analyses of spatial patterns of multisensory facilitation predict participants' hand locations within ARS; and (3) the multisensory facilitation map shifts isomorphically following hand positions, revealing hand-centered coding of PPS, therefore pointing to a functional similarity to the receptive fields of monkeys' multisensory neurons. A control experiment further corroborated these results and additionally ruled out the orienting of attention as the driving mechanism for the increased multisensory facilitation near the hand. In sharp contrast, ARS mapping results in a larger spatial extent, with indistinguishable patterns across hand positions, cross-validating the conclusion that PPS and ARS are distinct spatial representations. These findings show a need for refinement of theoretical models of PPS, which is relevant to constructs as diverse as self-representation, social interpersonal distance, and motor control.
Keywords: Hand-centered space; Multisensory; Perception; Peripersonal space; Reaching space
Year: 2021 PMID: 34159525 PMCID: PMC8642341 DOI: 10.3758/s13423-021-01942-9
Source DB: PubMed Journal: Psychon Bull Rev ISSN: 1069-9384
Fig. 1 Experimental setup across experiments. a Positions of right hand, fixation cross, and visual stimuli. b and c The close-hand (b) and distant-hand (c) conditions. In both experiments, the visual stimuli (here displayed as gray circles) were projected one at a time, in one of the ten possible positions (from V-P1 to V-P10), corrected for retinal size (a–c). Tactile and visual stimuli were presented alone (unisensory) or coupled synchronously with each other (multisensory). Globally, we adopted two conditions of unisensory stimulation (only tactile or visual stimulation) and a multisensory condition (visuo-tactile stimulation). To these, we added catch trials (neither visual nor tactile stimuli presented) to monitor participants' compliance
Formulas adopted to fit the curves for the multisensory gain values in Experiment 1. X represents one of the ten experimental positions (from V-P1 to V-P10). We used the same formulas to fit the sigmoidal and normal curves to reachability judgments in Experiment 2
| Sigmoidal | Normal |
|---|---|
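The formulas themselves appear to have been rendered as images in the original record and are not recoverable here. As a rough sketch only, curve fits of this kind commonly use a four-parameter logistic for the sigmoidal case and a scaled Gaussian for the normal case; the paper's exact parameterization is an assumption and may differ:

```latex
% Hypothetical standard forms (not the paper's verbatim equations).
% x ranges over the ten positions V-P1..V-P10.
f_{\mathrm{sigmoid}}(x) = d + \frac{a - d}{1 + e^{-b\,(x - c)}}
\qquad
f_{\mathrm{normal}}(x) = a\, e^{-\frac{(x - c)^2}{2\sigma^2}} + d
```

Here $c$ would locate the inflection point (sigmoid) or peak (Gaussian) along the near-to-far axis, which is the quantity of interest when comparing PPS and ARS extents.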
Fig. 2 Different patterns of hand-centered multisensory facilitation within ARS. a Multisensory gain (MG) values along the ten visual positions, ranging from near to far space, for the distant (yellow) and the close (green) hand conditions. Higher values of MG represent stronger facilitation in terms of RT in the multisensory condition relative to the unisensory tactile baseline (for which, by definition, MG = 0). Error bars represent the standard error of the mean. Asterisks represent a significant difference (p < 0.05, corrected). b and c Number of trials reporting MG values greater than zero (unisensory tactile baseline) along the ten visual positions, ranging from near to far space, for the close (b) and the distant (c) hand conditions
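The record defines MG only qualitatively (positive values mean faster multisensory RTs than the unisensory tactile baseline, and the baseline itself sits at MG = 0). A minimal sketch of one plausible proportional-gain formulation follows; the function name and the exact formula are assumptions, not the paper's published computation:

```python
import numpy as np

def multisensory_gain(rt_multisensory, rt_tactile_baseline):
    """Hypothetical MG sketch: proportional RT facilitation relative
    to the mean unisensory tactile baseline. Positive values indicate
    faster multisensory responses; the baseline maps to MG = 0."""
    rt_multi = np.asarray(rt_multisensory, dtype=float)
    baseline = float(np.mean(rt_tactile_baseline))
    return (baseline - rt_multi) / baseline  # > 0 -> facilitation

# Toy RTs in ms: multisensory trials faster than the tactile baseline
mg = multisensory_gain([380, 400, 390], [420, 430, 410])
```

One MG value per visual position (V-P1 to V-P10) would then yield the spatial facilitation pattern plotted in the figure.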
Fig. 3 The spatial pattern of MG shifts and follows the hand within reaching space. Cross-correlation analysis performed by distally shifting the pattern of MG values across all reachable positions in the close-hand condition. Red colors represent higher MG values. Values of Pearson's r and p values are reported for all the correlations performed. The black grid highlights the only significant correlation (p < 0.05)
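The shift-and-correlate logic can be sketched as follows: the close-hand MG pattern is displaced distally position by position and Pearson's r is computed against the distant-hand pattern over the overlapping positions. This is an illustrative reconstruction of the analysis idea, not the paper's exact pipeline (function name, pairing convention, and `max_shift` are assumptions):

```python
import numpy as np

def shift_correlations(mg_close, mg_distant, max_shift=4):
    """Correlate the close-hand MG pattern, shifted distally by
    0..max_shift positions, with the distant-hand MG pattern.
    Returns a list of (shift, Pearson r) over overlapping positions."""
    close = np.asarray(mg_close, dtype=float)
    distant = np.asarray(mg_distant, dtype=float)
    results = []
    for s in range(max_shift + 1):
        a = close[:close.size - s]   # close pattern, shifted distally by s
        b = distant[s:]              # aligned distant-hand positions
        r = np.corrcoef(a, b)[0, 1]
        results.append((s, r))
    return results
```

Under hand-centered coding, the best-correlating shift should match the distance the hand was displaced, producing the single significant cell highlighted in the figure.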
Fig. 4 No hand-centered MG spatial patterns in a reachability judgment task. a Multisensory gain (MG) values along the ten positions, ranging from near to far space, for the close (green) and distant (yellow) hand conditions. Higher values of MG represent stronger facilitation in terms of RT with respect to the unisensory visual baseline (by definition, MG = 0). Error bars represent the standard error of the mean. No significant differences between hand postures emerged. b PSE values calculated for both unisensory visual and multisensory visuo-tactile conditions, for both hand positions. Error bars represent the standard error of the mean. Asterisks indicate a significant difference between unisensory and multisensory conditions (p < 0.05)