Alessia Tessari, Giovanni Ottoboni, Andrea Mazzatenta, Arcangelo Merla, Roberto Nicoletti.
Abstract
Facial emotions and emotional body postures easily grab attention in social communication. In the context of faces, gaze has been shown to be an important cue for orienting attention, but less is known about other important body parts, such as hands. In the present study we investigated whether hands may orient attention through the emotional features they convey. By implying motion in static photographs of hands, we aimed to furnish observers with information about the intention to act and to test whether this interacted with the automatic coding of the hand. We compared neutral, frontal hands to emotionally threatening hands, rotated along their radial-ulnar axes, in a Sidedness task (a Simon-like task based on automatic access to body representation). Results showed a Sidedness effect for both the palm and the back views with both neutral and emotional hands. More importantly, no difference was found between the two views for neutral hands, but one emerged for emotional hands: reaction times were faster for the palm than for the back view. The difference was ascribed to the palm views' "offensive" pose: a source of threat that might have raised participants' arousal. This hypothesis was also supported by conscious evaluations of the dimensions of valence (pleasant-unpleasant) and arousal. Results are discussed in light of emotional feature coding.
Year: 2012 PMID: 23155444 PMCID: PMC3498372 DOI: 10.1371/journal.pone.0049011
Source DB: PubMed Journal: PLoS One ISSN: 1932-6203 Impact factor: 3.240
Figure 1. The figure provides a visual representation of the concept of "Sidedness".
A hand always generates a spatial code based on the side on which it is imagined with respect to a body of reference. The case reported here shows a left hand shifting from a "right sidedness" to a "left sidedness" (we thank Rory O'Sullivan for the Blender UniHuman character model used in this work; UniHuman is available at http://unihuman.yolasite.com/).
Figure 2. Hand stimuli used in Experiments 1a (on the left) and 1b (on the right): examples of both the palm and the back views are shown.
Mean RTs (in ms) and ERs (in %) as a function of condition (hand view) and corresponding vs. non-corresponding pairings for Conditions a and b.

| | Condition a: Back view | Condition a: Palm view | Condition b: Back view | Condition b: Palm view |
|---|---|---|---|---|
| RTs, corresponding | 336 (41) | 333 (37) | 324 (20) | 330 (23) |
| RTs, non-corresponding | 342 (40) | 328 (35) | 346 (30) | 325 (20) |
| ERs, corresponding | 4.4 (4.5) | 5.0 (3.9) | 1.9 (2.2) | 4.3 (2.6) |
| ERs, non-corresponding | 3.3 (2.7) | 3.7 (3.5) | 7.3 (4.6) | 2.8 (2.0) |

Standard deviations are reported in brackets.
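As a quick arithmetic check on the table above, the per-cell correspondence difference (non-corresponding minus corresponding mean RT) can be computed directly from the reported means. This is a minimal illustrative sketch, not the authors' own analysis: the sign convention and the variable names (`rts`, `correspondence_effect`) are assumptions introduced here.

```python
# Mean RTs (ms) copied from the table: (condition, view) -> pairing -> RT.
rts = {
    ("a", "back"): {"corr": 336, "noncorr": 342},
    ("a", "palm"): {"corr": 333, "noncorr": 328},
    ("b", "back"): {"corr": 324, "noncorr": 346},
    ("b", "palm"): {"corr": 330, "noncorr": 325},
}

def correspondence_effect(cell):
    """Non-corresponding minus corresponding mean RT, in ms.

    Positive values mean corresponding trials were faster; this sign
    convention is an illustrative choice, not taken from the paper.
    """
    return cell["noncorr"] - cell["corr"]

effects = {key: correspondence_effect(cell) for key, cell in rts.items()}
for (cond, view), delta in effects.items():
    print(f"Condition {cond}, {view} view: {delta:+d} ms")
```

Note that these raw mean differences summarize only the RT cells of the table; the paper's conclusions also rest on the error rates and on the conscious valence/arousal ratings.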