Arvid Guterstam, Andrew I. Wilterson, Davis Wachtell, Michael S. A. Graziano.
Abstract
Keeping track of other people's gaze is an essential task in social cognition and key for successfully reading other people's intentions and beliefs (theory of mind). Recent behavioral evidence suggests that we construct an implicit model of other people's gaze, which may incorporate physically incoherent attributes such as a construct of force-carrying beams that emanate from the eyes. Here, we used functional magnetic resonance imaging and multivoxel pattern analysis to test the prediction that the brain encodes gaze as implied motion streaming from an agent toward a gazed-upon object. We found that a classifier, trained to discriminate the direction of visual motion, significantly decoded the gaze direction in static images depicting a sighted face, but not a blindfolded one, from brain activity patterns in the human motion-sensitive middle temporal complex (MT+) and temporo-parietal junction (TPJ). Our results demonstrate a link between the visual motion system and social brain mechanisms, in which the TPJ, a key node in theory of mind, works in concert with MT+ to encode gaze as implied motion. This model may be a fundamental aspect of social cognition that allows us to efficiently connect agents with the objects of their attention. It is as if the brain draws a quick visual sketch with moving arrows to help keep track of who is attending to what. This implicit, fluid-flow model of other people's gaze may help explain culturally universal myths about the mind as an energy-like, flowing essence.
Keywords: gaze; motion perception; social cognition; theory of mind; visual attention
Year: 2020 PMID: 32457153 PMCID: PMC7293620 DOI: 10.1073/pnas.2003110117
Source DB: PubMed Journal: Proc Natl Acad Sci U S A ISSN: 0027-8424 Impact factor: 11.205
Fig. 1. Methods. (A) Schematic timeline of the fMRI design. While subjects continuously fixated on a central spot, they were exposed to 1.5-s trials of either a random dot motion stimulus (moving left or right), a static image of a face gazing at a tree (facing left or right), or an image of a blindfolded face (facing left or right). In a catch-trial condition, one of the image elements (head or tree) or the moving dots appeared bright green, and subjects pressed a button in response. Arrows shown here indicate dot motion directions and were not part of the actual stimuli. There were equal numbers of rightward- and leftward-facing face-and-tree trials, but only rightward-facing images are shown here. (B) To test our hypothesis that gaze is encoded as implied motion in motion-sensitive and social brain areas, we used a locally multivariate (searchlight), leave-one-run-out, cross-classification approach, using the run-wise regression (beta) coefficients as model input. We trained a classifier to discriminate the BOLD activity patterns associated with visual motion going left versus right in 19 runs, and then tested whether it could decode the activity patterns associated with gaze direction (eyes open, facing left versus right) significantly better than in the blindfolded condition (eyes covered, facing left versus right) in the left-out twentieth run, repeated for all runs.
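The leave-one-run-out cross-classification scheme described in (B) can be sketched with scikit-learn on simulated run-wise patterns. Everything here is an illustrative assumption, not the authors' actual pipeline: the data are synthetic Gaussian "beta patterns," the pattern size is arbitrary, and a linear SVM stands in for whatever classifier the study used. The key structural point it shows is training on one stimulus class (dot motion left vs. right) and testing on another (gaze left vs. right) in the held-out run.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_runs, n_voxels = 20, 50  # hypothetical searchlight size

# Simulated run-wise beta patterns, one per condition per run.
# In the real analysis these would come from a GLM fit to the fMRI data.
# Direction is coded as -1 (left) or +1 (right); a shared mean shift
# stands in for a direction signal common to motion and gaze.
motion = {d: rng.normal(d, 1.0, (n_runs, n_voxels)) for d in (-1, 1)}
gaze = {d: rng.normal(d, 1.0, (n_runs, n_voxels)) for d in (-1, 1)}

accs = []
for test_run in range(n_runs):
    train_runs = [r for r in range(n_runs) if r != test_run]
    # Train on motion direction in the 19 remaining runs...
    X_train = np.vstack([motion[d][train_runs] for d in (-1, 1)])
    y_train = np.repeat([-1, 1], len(train_runs))
    # ...then cross-decode gaze direction in the left-out run.
    X_test = np.vstack([gaze[d][[test_run]] for d in (-1, 1)])
    y_test = np.array([-1, 1])
    clf = LinearSVC().fit(X_train, y_train)
    accs.append(clf.score(X_test, y_test))

print(f"mean cross-decoding accuracy: {np.mean(accs):.2f} (chance = 0.50)")
```

In the study this fold loop would run once per searchlight sphere, with the eyes-covered condition decoded the same way as a control and the open-versus-covered accuracy difference taken to significance testing.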
Fig. 2. Results. Brain areas in which a classifier, trained on discriminating the direction of dot motion, significantly decoded the direction of gaze in static images of a face looking at an object (eyes open), using a blindfolded face as control (eyes covered). These results suggest that gaze is encoded as implied motion, in a specific direction, in the motion-sensitive middle temporal cortical complex (MT+, outlined in red) on the right side (A), and in the temporo-parietal junction (TPJ, red circles) bilaterally (B and C). Error bars show SE; significance is indicated by *P < 0.05 and ***P < 0.001, corrected for multiple comparisons. See text for statistical details. The decoding maps are thresholded at P < 0.001 (uncorrected) for visualization purposes. pSTS, posterior superior temporal sulcus.
Decoding results
| Anatomical region | MNI (x, y, z) | Peak T | P (corr) | Cluster size |
|---|---|---|---|---|
| Temporal lobe | | | | |
| R. posterior STS (TPJ) | 56, −54, 24 | 5.82 | <0.001 | 201 |
| R. parieto-temporo-occipital cortex (MT+) | 46, −70, 0 | 3.64 | 0.027 | 4 |
| R. fusiform gyrus | 22, −46, −16 | 3.62 | — | 11 |
| R. superior temporal gyrus | 72, −28, 6 | 3.71 | — | 5 |
| Parietal lobe | | | | |
| R. supramarginal gyrus | 56, −40, 36 | 4.18 | — | 14 |
| R. supramarginal gyrus | 58, −38, 44 | 3.64 | — | 4 |
| L. angular gyrus (TPJ) | −58, −62, 26 | 4.03 | 0.019 | 10 |
| Frontal lobe | | | | |
| R. precentral gyrus (premotor cortex) | 58, −4, 42 | 4.35 | — | 32 |
| L. inferior frontal gyrus | −46, 10, 20 | 3.62 | — | 7 |
| Insular cortex | | | | |
| L. midinsula | −36, −2, 16 | 4.45 | — | 20 |
| Subcortical structures | | | | |
| R. ventral striatum | 10, 4, −10 | 4.79 | — | 25 |
| L. putamen | −20, 6, 4 | 3.90 | — | 13 |
All brain regions (peaks) in which a classifier, trained on discriminating dot motion direction, decoded gaze direction better in the eyes-open than in the eyes-covered condition at a threshold of P < 0.001, uncorrected for multiple comparisons. All listed regions also decoded gaze direction in the eyes-open condition significantly (P < 0.05, uncorrected) better than chance (50%). Familywise error (FWE) rate-corrected (corr) P values are reported for regions that survived the correction for multiple comparisons in our predefined ROIs (small-volume correction), consisting of the activation cluster from the MT+ visual motion localizer, or 10-mm-radius spheres around the TPJ activation peaks in a previous fMRI study on theory of mind (11). The right TPJ peak in the posterior STS also survived correction for multiple comparisons using the whole brain as search space (P = 0.040). L., left; R., right.
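The 10-mm-radius spherical ROIs used for the small-volume correction amount to selecting all voxels within a fixed distance of a peak coordinate in MNI space. The NumPy sketch below illustrates this on a synthetic 2-mm grid around the right-TPJ peak (56, −54, 24); the grid and helper function are assumptions for illustration, not the study's analysis code.

```python
import numpy as np

def sphere_mask(coords_mm, center_mm, radius_mm=10.0):
    """Boolean mask selecting voxels within radius_mm of center_mm.

    coords_mm: (n_voxels, 3) array of voxel MNI coordinates in mm.
    center_mm: (3,) peak coordinate in mm.
    """
    dist = np.linalg.norm(coords_mm - np.asarray(center_mm, float), axis=1)
    return dist <= radius_mm

# Hypothetical 2-mm isotropic grid covering the right-TPJ peak.
grid = np.mgrid[40:72:2, -70:-38:2, 8:40:2].reshape(3, -1).T.astype(float)
mask = sphere_mask(grid, (56, -54, 24), radius_mm=10.0)
print(int(mask.sum()), "voxels inside the 10-mm sphere")
```

In practice such a mask would be intersected with the brain and applied to the decoding map before computing the FWE-corrected P values within the reduced search space.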