Abstract
Understanding the expression levels of proteins and their interactions is key to diagnosing and explaining Down syndrome, the most prevalent cause of intellectual disability in humans. In previous studies, the expression levels of 77 proteins obtained from control mice with a normal genotype and from trisomic Ts65Dn mice were analyzed after training in contextual fear conditioning, with and without injection of the drug memantine, using statistical methods and machine learning techniques. Recent studies have also suggested a link between Down syndrome and the immune system. The research presented in this paper therefore aims at in silico identification of proteins that are significant to the learning process and the immune system, and at deriving the most accurate model for classifying the mice. Features are selected with a forward feature selection method after a preprocessing step on the dataset; deep neural network, gradient boosted tree, support vector machine, and random forest classifiers are then applied to measure accuracy. The selected feature subsets not only yield higher classification accuracy but are also composed of protein responses important to learning and memory and to the immune system.
Year: 2019 PMID: 30689644 PMCID: PMC6349309 DOI: 10.1371/journal.pone.0210954
Source DB: PubMed Journal: PLoS One ISSN: 1932-6203 Impact factor: 3.240
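The abstract mentions a preprocessing step before feature selection; the underlying protein expression matrix is known to contain missing measurements. A minimal sketch of such a step, assuming mean imputation and standardization (the data here are synthetic placeholders, not the actual 77-protein dataset):

```python
# Preprocessing sketch: impute missing protein measurements, then standardize.
# Shapes and values are synthetic stand-ins for the real expression matrix.
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(loc=1.0, scale=0.3, size=(60, 77))   # 60 mice x 77 proteins
X[rng.random(X.shape) < 0.05] = np.nan              # ~5% missing entries

X_imp = SimpleImputer(strategy="mean").fit_transform(X)  # fill NaNs per column
X_std = StandardScaler().fit_transform(X_imp)            # zero mean, unit variance

print("remaining NaNs:", np.isnan(X_std).sum())
```

The actual paper does not specify the imputation strategy here; mean imputation is one common choice for this dataset.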
Studied techniques in the literature.
| Study | Techniques |
|---|---|
| Ahmed et al. [ | 3LME statistical model |
| Higuera et al. [ | Unsupervised learning (SOM) + Wilcoxon rank-sum test |
| Eicher et al. [ | Supervised learning (linear SVM) + Wilcoxon rank-sum test |
| A. Block et al. [ | 3LME statistical model |
| B. Feng et al. [ | Supervised learning (AdaBoost) |
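Several of the cited studies pair a learner with the Wilcoxon rank-sum test to screen for proteins whose expression differs between two groups of mice. A minimal sketch of that screening step on synthetic data (the three "proteins" and the effect size are illustrative):

```python
# Wilcoxon rank-sum screening sketch: flag proteins whose expression
# distribution differs between two groups. Data are synthetic.
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(1)
control = rng.normal(1.0, 0.2, size=(30, 3))     # 3 proteins, 30 control mice
trisomic = control + np.array([0.5, 0.0, 0.0])   # only protein 0 is shifted

for j in range(control.shape[1]):
    stat, p = ranksums(control[:, j], trisomic[:, j])
    print(f"protein {j}: p = {p:.3g}")
```

Only the shifted protein should come out significant; in the cited studies the surviving proteins are then fed to the learner (SOM, linear SVM).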
Description of protein expression data.
| Mice | P1 | P2 | … | P77 | Class |
|---|---|---|---|---|---|
| mouse 1 | 0.504 | 0.747 | … | 1.676 | c-CS-m |
| mouse 2 | 0.515 | 0.689 | … | 1.744 | c-CS-m |
| mouse 3 | 0.509 | 0.730 | … | 1.926 | c-CS-m |
| … | … | … | … | … | … |
| mouse n | | | | | |
Classes in the dataset.
| Class | Type of Mice | Type of Experiment | Treatment | Number of Mice | Learning Outcome |
|---|---|---|---|---|---|
| c−SC−s | Control | Shock Context | Saline | 9 | No Learning |
| c−SC−m | Control | Shock Context | Memantine | 10 | No Learning |
| c−CS−s | Control | Context Shock | Saline | 9 | Normal Learning |
| c−CS−m | Control | Context Shock | Memantine | 10 | Normal Learning |
| t−SC−s | Trisomic | Shock Context | Saline | 9 | No Learning |
| t−SC−m | Trisomic | Shock Context | Memantine | 9 | No Learning |
| t−CS−s | Trisomic | Context Shock | Saline | 7 | Failed Learning |
| t−CS−m | Trisomic | Context Shock | Memantine | 9 | Rescued Learning |
Fig 1. Random forest classification algorithm.
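Fig 1 depicts the random forest algorithm: an ensemble of decision trees, each fit on a bootstrap sample of the data, whose majority vote gives the predicted class. A minimal sketch with synthetic two-class data standing in for the mice protein measurements:

```python
# Random forest sketch: bootstrap-sampled decision trees + majority vote.
# The toy labels depend on only two of the 77 synthetic "protein" columns.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
X = rng.normal(size=(120, 77))
y = (X[:, 0] + X[:, 1] > 0).astype(int)          # toy learnable rule

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"test accuracy: {rf.score(X_te, y_te):.3f}")
```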
Feature subset of normal learning.
| Step | Accuracy | Proteins Added | |
|---|---|---|---|
| 1 | 0.656 | DYRK1A | |
| 2 | 0.751 | Ubiquitin | ITSN1 |
| 3 | 0.852 | pERK | |
| 4 | 0.873 | BRAF | |
| 5 | 0.905 | | |
| 6 | 0.921 | IL1B | pNUMB |
| 7 | 0.937 | BAX | |
| 8 | 0.942 | pNR2A | CDK5 |
| 9 | 0.942 | BDNF | |
| 10 | 0.942 | pJNK | GFAP |
| 11 | 0.942 | pCFOS | |
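The subsets above are built by forward feature selection: at each step the protein whose addition most improves cross-validated accuracy is kept, and the accuracy after each addition is recorded (the second column of the tables). A minimal sketch of that greedy loop on synthetic data, using an SVM as the inner classifier (the paper evaluates several classifiers; the specific inner model here is an assumption):

```python
# Forward feature selection sketch: greedily add the feature that most
# improves mean cross-validated accuracy. Data are synthetic; only
# features 2 and 5 carry signal.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(3)
X = rng.normal(size=(80, 10))
y = (X[:, 2] - X[:, 5] > 0).astype(int)

selected, scores = [], []
remaining = list(range(X.shape[1]))
for _ in range(4):                               # grow the subset to 4 features
    best_f, best_s = None, -1.0
    for f in remaining:
        s = cross_val_score(SVC(), X[:, selected + [f]], y, cv=5).mean()
        if s > best_s:
            best_f, best_s = f, s
    selected.append(best_f)
    remaining.remove(best_f)
    scores.append(best_s)                        # cumulative accuracy per step

print("selected:", selected)
print("accuracies:", [round(s, 3) for s in scores])
```

The two informative features should be picked first, with accuracy plateauing afterwards, mirroring the plateau visible in the tables.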
Feature subset of rescued learning.
| Step | Accuracy | Proteins Added | |
|---|---|---|---|
| 1 | 0.762 | DYRK1A | |
| 2 | 0.838 | S6 | pERK |
| 3 | 0.850 | | |
| 4 | 0.887 | BDNF | |
| 5 | 0.887 | pCREB | RRP1 |
| 6 | 0.900 | PKCA | GFAP |
| 7 | 0.912 | SOD1 | GluR3 |
| 8 | 0.925 | PSD95 | P3525 |
| 9 | 0.925 | pNR2A | Ubiquitin |
Feature subset of failed learning.
| Step | Accuracy | Proteins Added | |
|---|---|---|---|
| 1 | 0.636 | pNR1 | |
| 2 | 0.713 | pPKCAB | APP |
| 3 | 0.775 | CAMKII | MTOR |
| 4 | 0.814 | pCAMKII | |
| 5 | 0.868 | NR2B | |
| 6 | 0.891 | DSCR1 | RAPTOR |
| 7 | 0.907 | nNOS | S6 |
| 8 | 0.915 | BAX | Tau |
| 9 | 0.930 | pCFOS | |
| 10 | 0.930 | ERK | EGR1 |
Fig 2. PCA of the full protein set and the selected protein subset for successful learning.
Fig 3. PCA of the full protein set and the selected protein subset for rescued learning.
Fig 4. PCA of the full protein set and the selected protein subset for failed learning.
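Figures 2-4 project the expression matrix onto its first two principal components to visualize how well the classes separate, once for all 77 proteins and once for the selected subset. A minimal sketch of that projection on synthetic data:

```python
# PCA sketch: project the (mice x proteins) matrix onto 2 components,
# as done for the visual class-separation comparison. Data are synthetic.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
X = rng.normal(size=(60, 77))          # stand-in for the expression matrix

pca = PCA(n_components=2)
Z = pca.fit_transform(X)               # one 2-D point per mouse

print("projected shape:", Z.shape)
print("explained variance ratio:", pca.explained_variance_ratio_.round(3))
```

Repeating the projection on only the selected columns (e.g. `X[:, subset]`) gives the second panel of each figure.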
Accuracy result comparison of normal learning.
| Classifier | Accuracy Result of Our Work | Previous Work Accuracy Result [ |
|---|---|---|
| Deep Neural Network | 0.972 | 0.967 |
| Gradient Boosted Tree | 0.935 | 0.902 |
| Random Forest | 0.963 | 0.902 |
| SVM | 0.981 | 0.961 |
Accuracy result comparison of rescued learning.
| Classifier | Accuracy Result of Our Work | Previous Work Accuracy Result [ |
|---|---|---|
| Deep Neural Network | 0.971 | 0.954 |
| Gradient Boosted Tree | 0.933 | 0.892 |
| Random Forest | 0.946 | 0.883 |
| SVM | 0.971 | 0.921 |
Accuracy result comparison of failed learning.
| Classifier | Accuracy Result of Our Work | Previous Work Accuracy Result [ |
|---|---|---|
| Deep Neural Network | 0.926 | 0.921 |
| Gradient Boosted Tree | 0.879 | 0.844 |
| Random Forest | 0.892 | 0.859 |
| SVM | 0.926 | 0.910 |
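The three comparison tables evaluate the same four model families on each learning task. A hedged sketch of that evaluation loop with cross-validation on synthetic data, using scikit-learn's `MLPClassifier` as a stand-in for the deep neural network (the paper's exact architectures and hyperparameters are not reproduced here):

```python
# Classifier comparison sketch: cross-validated accuracy for the four
# model families reported in the tables. Data and settings are synthetic.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(5)
X = rng.normal(size=(100, 20))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

models = {
    "Deep Neural Network": MLPClassifier(hidden_layer_sizes=(32, 16),
                                         max_iter=2000, random_state=0),
    "Gradient Boosted Tree": GradientBoostingClassifier(random_state=0),
    "Random Forest": RandomForestClassifier(random_state=0),
    "SVM": SVC(),
}

results = {}
for name, model in models.items():
    results[name] = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: {results[name]:.3f}")
```

On the real data, each family would be run once per task (normal, rescued, failed learning) on the corresponding selected feature subset.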