
Beyond the FFA: Brain-behavior correspondences in face recognition abilities.

Daniel B Elbich, Suzanne Scherf.

Abstract

Despite the thousands of papers investigating the neural basis of face perception in both humans and non-human primates, very little is known about how activation within this neural architecture relates to face processing behavior. Here, we investigated individual differences in brain-behavior correspondences within both core and extended regions of the face-processing system in healthy typically developing adults. To do so, we employed a set of behavioral and neural measures to capture a multifaceted perspective on assessing these brain-behavior relations. This included quantifying face and object recognition behavior, the magnitude and size of functional activation within each region, as well as a measure of global activation across regions. We report that face, but not object, recognition behavior was associated with 1) the magnitude of face-selective activation in the left FFA1, 2) larger face-related regions in multiple bilateral face-patches in the fusiform gyri as well as the bilateral anterior temporal lobe and amygdala, and 3) more distributed global face-network activation. In contrast, face recognition behavior was not associated with any measure of object- or place-selective activation. These findings suggest that superior behavior is served by engaging sufficiently large, distributed patches of neural real estate, which might reflect the integration of independent populations of neurons that enables the formation of richer representations.
Copyright © 2016 Elsevier Inc. All rights reserved.

Keywords:  Amygdala; FFA; Face recognition; Fusiform gyrus; Individual differences; fMRI

Year:  2016        PMID: 27993674     DOI: 10.1016/j.neuroimage.2016.12.042

Source DB:  PubMed          Journal:  Neuroimage        ISSN: 1053-8119            Impact factor:   6.556


Related articles:  9 in total

1.  Multifaceted Integration: Memory for Faces Is Subserved by Widespread Connections between Visual, Memory, Auditory, and Social Networks.

Authors:  Michal Ramot; Catherine Walsh; Alex Martin
Journal:  J Neurosci       Date:  2019-04-29       Impact factor: 6.167

2.  Face Recognition. (Review)

Authors:  Steven Z Rapcsak
Journal:  Curr Neurol Neurosci Rep       Date:  2019-05-30       Impact factor: 5.081

3.  Evaluating the organizational structure and specificity of network topology within the face processing system.

Authors:  Daniel B Elbich; Peter C M Molenaar; K Suzanne Scherf
Journal:  Hum Brain Mapp       Date:  2019-02-18       Impact factor: 5.038

4.  Modular community structure of the face network supports face recognition.

Authors:  Gidon Levakov; Olaf Sporns; Galia Avidan
Journal:  Cereb Cortex       Date:  2022-09-04       Impact factor: 4.861

5.  Investigating the Influence of Biological Sex on the Behavioral and Neural Basis of Face Recognition.

Authors:  K Suzanne Scherf; Daniel B Elbich; Natalie V Motta-Mena
Journal:  eNeuro       Date:  2017-05-09

6.  Concrete versus abstract forms of social concept: an fMRI comparison of knowledge about people versus social terms.

Authors:  Grace E Rice; Paul Hoffman; Richard J Binney; Matthew A Lambon Ralph
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2018-08-05       Impact factor: 6.671

7.  Age-related increase of image-invariance in the fusiform face area.

Authors:  Marisa Nordt; Kilian Semmelmann; Erhan Genç; Sarah Weigelt
Journal:  Dev Cogn Neurosci       Date:  2018-04-22       Impact factor: 6.464

8.  Introducing the female Cambridge face memory test - long form (F-CFMT+).

Authors:  Myles Arrington; Daniel Elbich; Junqiang Dai; Bradley Duchaine; K Suzanne Scherf
Journal:  Behav Res Methods       Date:  2022-02-22

9.  A quantitative meta-analysis of face recognition deficits in autism: 40 years of research.

Authors:  Jason W Griffin; Russell Bauer; K Suzanne Scherf
Journal:  Psychol Bull       Date:  2020-10-26       Impact factor: 17.737

