Somatosensory interactions reveal feature-dependent computations.

Md Shoaibur Rahman, Jeffrey M Yau

Abstract

Our ability to perceive and discriminate textures is based on the processing of high-frequency vibrations generated on the fingertip as it scans across a surface. Although much is known about the processing of vibration amplitude and frequency information when cutaneous stimulation is experienced at a single location on the body, how these stimulus features are processed when touch occurs at multiple locations is poorly understood. We evaluated participants' ability to discriminate tactile cues (100-300 Hz) on one hand while they ignored distractor cues experienced on their other hand. We manipulated the relative positions of the hands to characterize how limb position influenced cutaneous touch interactions. In separate experiments, participants judged either the frequency or intensity of mechanical vibrations. We found that vibrations experienced on one hand always systematically modulated the perception of vibrations on the other hand. Notably, bimanual interaction patterns and their sensitivity to hand locations differed according to stimulus feature. Somatosensory interactions in intensity perception were marked only by attenuation that was invariant to hand position manipulations. In contrast, interactions in frequency perception consisted of both bias and sensitivity changes that were more pronounced when the hands were held in close proximity. We implemented models to infer the neural computations that mediate somatosensory interactions in the intensity and frequency dimensions. Our findings reveal obligatory and feature-dependent somatosensory interactions that may be supported by both feature-specific and feature-general operations.

NEW & NOTEWORTHY Little is known about the neural computations mediating feature-specific sensory interactions between the hands. We show that vibrations experienced on one hand systematically modulate the perception of vibrations felt on the other hand. Critically, interaction patterns and their dependence on the relative positions of the hands differed depending on whether participants judged vibration intensity or frequency. These results, which we recapitulate with models, imply that somatosensory interactions are mediated by feature-dependent neural computations.
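
In abstracts like this, "bias" and "sensitivity" changes are typically read off a fitted psychometric function: bias appears as a shift in the point of subjective equality (PSE), and sensitivity as the inverse of the function's slope parameter. The following minimal Python sketch illustrates that analysis; it is not the authors' code, and the response values and the 200-Hz standard are invented for illustration only.

    # Minimal sketch: estimating bias (PSE) and sensitivity from
    # two-alternative frequency-discrimination responses by fitting a
    # cumulative-Gaussian psychometric function. All data values and the
    # 200-Hz standard are hypothetical.
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    def psychometric(x, mu, sigma):
        # P("comparison felt higher") at comparison frequency x (Hz);
        # mu is the PSE (bias), sigma the discrimination threshold
        # (sensitivity ~ 1/sigma).
        return norm.cdf(x, loc=mu, scale=sigma)

    # Hypothetical proportions of "higher" responses around an assumed
    # 200-Hz standard, with a distractor vibration on the other hand.
    comparison_hz = np.array([100, 150, 175, 200, 225, 250, 300], float)
    p_higher = np.array([0.04, 0.18, 0.34, 0.55, 0.76, 0.91, 0.99])

    (mu, sigma), _ = curve_fit(psychometric, comparison_hz, p_higher,
                               p0=[200.0, 40.0])
    print(f"PSE (bias): {mu:.1f} Hz; sigma (1/sensitivity): {sigma:.1f} Hz")
    # A distractor-induced PSE shift away from 200 Hz would indicate bias;
    # a larger fitted sigma would indicate reduced sensitivity.

The cumulative Gaussian is one standard choice of psychometric function; logistic or Weibull fits are common alternatives and would yield analogous bias and sensitivity parameters.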

Keywords:  bimanual interactions; integration; perception; touch; vibrotactile

Year:  2019        PMID: 30969894     DOI: 10.1152/jn.00168.2019

Source DB:  PubMed          Journal:  J Neurophysiol        ISSN: 0022-3077            Impact factor:   2.714


  Related articles: 5 in total

1.  Auditory and tactile frequency representations are co-embedded in modality-defined cortical sensory systems.

Authors:  Md Shoaibur Rahman; Kelly Anne Barnes; Lexi E Crommett; Mark Tommerdahl; Jeffrey M Yau
Journal:  Neuroimage       Date:  2020-04-11       Impact factor: 6.556

2.  Electro-Haptic Enhancement of Spatial Hearing in Cochlear Implant Users.

Authors:  Mark D Fletcher; Robyn O Cunningham; Sean R Mills
Journal:  Sci Rep       Date:  2020-01-31       Impact factor: 4.379

3.  Sensitivity to haptic sound-localisation cues.

Authors:  Mark D Fletcher; Jana Zgheib; Samuel W Perry
Journal:  Sci Rep       Date:  2021-01-11       Impact factor: 4.379

4.  Predictive attenuation of touch and tactile gating are distinct perceptual phenomena.

Authors:  Konstantina Kilteni; H Henrik Ehrsson
Journal:  iScience       Date:  2022-03-14

5.  Multidigit tactile perception I: motion integration benefits for tactile trajectories presented bimanually.

Authors:  Irena Arslanova; Shinya Takamuku; Hiroaki Gomi; Patrick Haggard
Journal:  J Neurophysiol       Date:  2022-07-13       Impact factor: 2.974
