| Literature DB >> 31082774 |
Liana C L Portugal1, Jessica Schrouff2, Ricki Stiffler3, Michele Bertocci3, Genna Bebko3, Henry Chase3, Jeanette Lockovitch3, Haris Aslam3, Simona Graur3, Tsafrir Greenberg3, Mirtes Pereira4, Leticia Oliveira4, Mary Phillips5, Janaina Mourão-Miranda2.
Abstract
BACKGROUND: It is becoming increasingly clear that the pathophysiological processes underlying psychiatric disorder categories are heterogeneous on many levels, including symptoms, disease course, comorbidity and biological underpinnings. This heterogeneity poses challenges for identifying biological markers associated with dimensions of symptoms and behaviour that could provide targets to guide treatment choice and the development of novel treatments. In response, the Research Domain Criteria (RDoC) framework (Insel et al., 2010) was developed to advocate a dimensional approach to understanding the pathophysiological processes underlying psychiatric disorders, one that omits disease definitions, disorder thresholds, and cut-points for various levels of psychopathology. In the present study we aimed to apply pattern regression analysis to identify brain signatures during dynamic emotional face processing that are predictive of anxiety and depression symptoms on a continuum ranging from normal to pathological levels, cutting across categorically defined diagnoses.
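The core analytic idea can be sketched in code: fit a regression model (the paper uses Gaussian process regression) on subjects' whole-brain activation patterns to predict a continuous symptom score, using cross-validation so each subject's score is predicted by a model that never saw that subject. The sketch below uses scikit-learn on synthetic data; the dimensions, kernel, and preprocessing are illustrative assumptions, not the study's pipeline.

```python
# Minimal sketch of pattern regression with cross-validation, assuming
# synthetic data: predict a continuous clinical score from voxel-wise
# activation patterns. Sizes and model settings are illustrative only.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
n_subjects, n_voxels = 60, 500                     # illustrative dimensions
X = rng.standard_normal((n_subjects, n_voxels))    # activation patterns
w_true = rng.standard_normal(n_voxels)
y = X @ w_true + rng.standard_normal(n_subjects)   # synthetic "scores"

# Two-fold cross-validation: each subject is predicted by a model
# trained on the other fold, mirroring the scheme described in Fig. 1.
y_pred = np.empty(n_subjects)
for train, test in KFold(n_splits=2, shuffle=True, random_state=0).split(X):
    model = GaussianProcessRegressor().fit(X[train], y[train])
    y_pred[test] = model.predict(X[test])

# Agreement between actual and predicted scores
r = np.corrcoef(y, y_pred)[0, 1]       # Pearson correlation
mse = np.mean((y - y_pred) ** 2)       # mean squared error
```

The cross-validated predictions are what the actual-vs-predicted scatter plots in Fig. 1 visualize; correlation and MSE are typical agreement measures for such models.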
Keywords: Anxiety; Depression; Faces; MVPA; Machine learning; Pattern recognition; Pattern regression analysis; RDoC; fMRI
Year: 2019 PMID: 31082774 PMCID: PMC6517640 DOI: 10.1016/j.nicl.2019.101813
Source DB: PubMed Journal: Neuroimage Clin ISSN: 2213-1582 Impact factor: 4.881
Mean and standard deviation (SD) of measures for the whole sample, distressed sample and healthy sample. Values are mean (SD).
| Measures | Whole sample | Distressed sample | Healthy sample |
|---|---|---|---|
| STAI-T | 43.6 (15.0) | 54.9 (11.0) | 30.6 (5.7) |
| STAI-S | 39.0 (13.2) | 48.0 (10.8) | 28.7 (6.0) |
| MASQ-D | 25.8 (13.4) | 35.0 (12.0) | 15.3 (3.5) |
| HDRS | 8.4 (8.6) | 15.0 (6.0) | 0.8 (1.5) |
| HAM-A | 6.8 (7.4) | 12.2 (6.3) | 0.6 (1.2) |
Measures of agreement between actual and decoded scores, based on whole-brain activity patterns in response to emotional faces, after controlling for the covariate (age) in the whole sample. Significant results are displayed in red.
For reference: corrected p-value=0.005.
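Significance of such agreement measures is commonly assessed with a permutation test: shuffle the scores, rerun the prediction, and ask how often a shuffled correlation matches the observed one. The sketch below shows the simplest variant (permuting labels against fixed predictions); the paper's exact permutation scheme is not specified here, so treat this as an assumed illustration.

```python
# Hedged sketch of a permutation test for the correlation between
# actual and predicted scores. The exact procedure used in the study
# is assumed, not reproduced.
import numpy as np

def perm_pvalue(y_true, y_pred, n_perm=1000, seed=0):
    """P-value for the observed Pearson correlation under permutation."""
    rng = np.random.default_rng(seed)
    r_obs = np.corrcoef(y_true, y_pred)[0, 1]
    count = 0
    for _ in range(n_perm):
        # Break the true pairing by shuffling the actual scores
        r_perm = np.corrcoef(rng.permutation(y_true), y_pred)[0, 1]
        if r_perm >= r_obs:
            count += 1
    return (count + 1) / (n_perm + 1)   # add-one (unbiased) correction
```

With multiple models tested, such p-values would then be corrected for multiple comparisons, consistent with the corrected threshold quoted above.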
Fig. 1(A) Scatter plot of the actual versus predicted STAI-T scores for the model based on patterns of brain activation to dynamic emotional face processing in the whole sample, applying a two-fold cross-validation scheme. For visualization purposes, subjects were colour coded as belonging to the healthy and distressed samples. (B) Same plot as in (A), but with subjects colour coded according to categorically defined diagnoses. (C) Scatter plot of the actual versus predicted STAI-T scores for the same model applying a five-fold cross-validation scheme; again, subjects were colour coded as belonging to the healthy and distressed samples. (D) Same plot as in (C), but with subjects colour coded according to categorically defined diagnoses. Distressed individuals below threshold for any disorder were labelled as 'no-diagnosis'.
Fig. 2 Weight maps for the GPR model predicting STAI-T based on patterns of activation to dynamic emotional face processing, using a two-fold cross-validation framework on the whole sample. A: Voxel-based predictive pattern. The colour bar indicates the weight of each voxel for decoding the clinical score. B: Region-based pattern localization map computed from the voxel-based predictive pattern displayed in Fig. 2A. The colour bar indicates the percentage of the total normalized weights that each anatomically labelled region explains.
Top 20 ranked regions according to normalized weights per region, which together represent 28.6% of the total weights of the predictive function.
| Rank | Brain regions | %NW |
|---|---|---|
| 1 | Rectus_L | 2.5 |
| 2 | Occipital_Inf_L | 1.8 |
| 3 | Occipital_Inf_R | 1.8 |
| 4 | Rectus_R | 1.7 |
| 5 | Cerebelum_3_L | 1.6 |
| 6 | Fusiform_R | 1.6 |
| 7 | Cerebelum_7b_L | 1.5 |
| 8 | Frontal_Inf_Oper_L | 1.4 |
| 9 | Occipital_Mid_R | 1.4 |
| 10 | Temporal_Inf_R | 1.3 |
| 11 | Frontal_Mid_Orb_L | 1.3 |
| 12 | Cerebelum_4_5_L | 1.3 |
| 13 | Frontal_Mid_L | 1.3 |
| 14 | Frontal_Inf_Tri_L | 1.2 |
| 15 | Cerebelum_6_R | 1.2 |
| 16 | Vermis_3 | 1.2 |
| 17 | Fusiform_L | 1.2 |
| 18 | Temporal_Pole_Mid_L | 1.1 |
| 19 | Frontal_Mid_Orb_R | 1.1 |
| 20 | Frontal_Sup_Medial_R | 1.1 |
Abbreviations: Inf: Inferior; L: Left; Mid: Middle; Oper: Opercularis; Orb: Orbital; Post: Posterior; R: Right; Sup: Superior; Supp: Supplementary; Tri: Triangularis; % NW: Percentage of the total normalized weights that each anatomical region explains.
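One plausible way to obtain region-level percentages like the %NW column above is to sum the normalized absolute voxel weights within each atlas label and express each region's sum as a share of the total. This is an assumed reconstruction of the region-summarization step, not the paper's verified implementation; the atlas labels in the test are made up.

```python
# Hedged sketch: summarize a voxel-wise weight map into per-region
# percentages of total normalized weight, given an anatomical label
# for every voxel. Assumed procedure, for illustration only.
import numpy as np

def region_percent_weights(voxel_weights, voxel_labels):
    """Return (region, %NW) pairs sorted by descending percentage."""
    w = np.abs(np.asarray(voxel_weights, dtype=float))
    w = w / w.sum()                       # normalize weights to unit total
    totals = {}
    for label, wi in zip(voxel_labels, w):
        totals[label] = totals.get(label, 0.0) + wi
    # Express each region's summed weight as a percentage of the total
    return sorted(((label, 100.0 * s) for label, s in totals.items()),
                  key=lambda pair: -pair[1])
```

Ranking the output of such a summary and keeping the top 20 regions would yield a table of the form shown above.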