| Literature DB >> 28031031 |
Shun Guo1,2, Qingshan Jiang2, Lifei Chen3, Donghui Guo4.
Abstract
BACKGROUND: Inferring the topology of gene regulatory networks (GRNs) from microarray gene expression data has many potential applications, such as identifying candidate drug targets and providing valuable insights into biological processes. It remains a challenge because the data are noisy and high-dimensional, and there is a large number of potential interactions.
Keywords: Ensemble; Gene Regulatory Network inference; Gene expression data; Partial least squares (PLS)
Year: 2016 PMID: 28031031 PMCID: PMC5192600 DOI: 10.1186/s12859-016-1398-6
Source DB: PubMed Journal: BMC Bioinformatics ISSN: 1471-2105 Impact factor: 3.169
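The abstract describes ranking candidate regulators of each target gene via partial least squares (PLS). The sketch below illustrates only the general idea — a one-component PLS1 fit per target gene, with regulators ranked by absolute regression coefficient — not the paper's actual PLSNET algorithm; the simulated data and the scoring rule are assumptions for illustration.

```python
# Illustrative PLS-based regulator ranking for one target gene.
# NOT the paper's PLSNET method: a one-component PLS1 fit, with
# candidate regulators ranked by |regression coefficient|.

def pls1_scores(X, y):
    """One-component PLS1: importance (|coefficient|) per predictor."""
    n, p = len(X), len(X[0])
    # Center the predictors (candidate regulators) and the response (target gene).
    xm = [sum(row[j] for row in X) / n for j in range(p)]
    ym = sum(y) / n
    Xc = [[row[j] - xm[j] for j in range(p)] for row in X]
    yc = [v - ym for v in y]
    # PLS weight vector w ∝ X^T y: the direction of maximal covariance with y.
    w = [sum(Xc[i][j] * yc[i] for i in range(n)) for j in range(p)]
    norm = sum(v * v for v in w) ** 0.5 or 1.0
    w = [v / norm for v in w]
    # Latent score t = Xc w, then regress y on t for the scalar loading b.
    t = [sum(Xc[i][j] * w[j] for j in range(p)) for i in range(n)]
    b = sum(ti * yi for ti, yi in zip(t, yc)) / sum(ti * ti for ti in t)
    # Back-map to per-regulator coefficients; magnitude = edge confidence.
    return [abs(b * wj) for wj in w]

if __name__ == "__main__":
    import random
    random.seed(0)
    # 50 samples, 4 candidate regulators; the target is driven by regulator 0.
    X = [[random.gauss(0, 1) for _ in range(4)] for _ in range(50)]
    y = [2.0 * row[0] + random.gauss(0, 0.1) for row in X]
    scores = pls1_scores(X, y)
    print(scores.index(max(scores)))  # regulator 0 ranks first
```

Running the same fit with every gene in turn as the target, then pooling the per-target rankings, yields a global edge ranking of the kind scored in the tables below.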
The computational complexity of different GRN inference methods
| Method | Complexity |
|---|---|
| GENIE3 | |
| TIGRESS | |
| CLR | |
| ARACNE | |
| NIMEFI | |
| PLSNET | |
The computational complexity of PLSNET and other GRN inference methods with respect to the number of genes P, the number of iterations T, and the number of samples N
Datasets
| Network | # Genes | # Regulatory genes | # Samples | # Verified interactions |
|---|---|---|---|---|
| DREAM5 Network 1 (in-silico) | 1643 | 195 | 805 | 4012 |
| DREAM5 Network 3 (E. coli) | 4511 | 334 | 805 | 2066 |
| DREAM5 Network 4 (S. cerevisiae) | 5950 | 333 | 536 | 3940 |
| DREAM4 Multifactorial Network 1 | 100 | 100 | 100 | 176 |
| DREAM4 Multifactorial Network 2 | 100 | 100 | 100 | 249 |
| DREAM4 Multifactorial Network 3 | 100 | 100 | 100 | 195 |
| DREAM4 Multifactorial Network 4 | 100 | 100 | 100 | 211 |
| DREAM4 Multifactorial Network 5 | 100 | 100 | 100 | 193 |
Performance comparisons of different GRN inference methods on the DREAM4 networks (size-100 Multifactorial challenge)
| Method | Network 1 | | Network 2 | | Network 3 | | Network 4 | | Network 5 | | Overall Score |
|---|---|---|---|---|---|---|---|---|---|---|---|
| | AUPR | AUROC | AUPR | AUROC | AUPR | AUROC | AUPR | AUROC | AUPR | AUROC | |
| GENIE3 | | 0.750 | 0.154 | 0.734 | 0.234 | 0.776 | 0.211 | 0.800 | 0.200 | 0.795 | 38.033 |
| TIGRESS | 0.158 | 0.747 | 0.161 | 0.703 | 0.233 | 0.761 | 0.225 | 0.774 | 0.233 | 0.754 | 36.590 |
| CLR | 0.143 | 0.701 | 0.117 | 0.695 | 0.174 | 0.744 | 0.181 | 0.753 | 0.175 | 0.723 | 29.112 |
| ARACNE | 0.122 | 0.605 | 0.102 | 0.603 | 0.201 | 0.691 | 0.159 | 0.713 | 0.167 | 0.661 | 23.478 |
| NIMEFI | 0.157 | | 0.157 | 0.731 | | 0.776 | 0.225 | 0.806 | | | 40.762 |
| PLSNET | 0.118 | 0.713 | | | 0.202 | | | | 0.206 | 0.786 | |
| Winner of the Challenge | | | | | | | | | | | |
| GENIE3 | 0.154 | 0.745 | 0.155 | 0.733 | 0.231 | 0.775 | 0.208 | 0.791 | 0.197 | 0.798 | 37.428 |
| 2nd | 0.108 | 0.739 | 0.147 | 0.694 | 0.185 | 0.748 | 0.161 | 0.736 | 0.111 | 0.745 | 28.165 |
| 3rd | 0.140 | 0.658 | 0.098 | 0.626 | 0.215 | 0.717 | 0.201 | 0.693 | 0.194 | 0.719 | 27.053 |
The best results in each column are in bold. Numbers in the "Winner of the Challenge" part of the table correspond to the best methods participating in the challenge, as listed on the DREAM web site
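The AUPR and AUROC columns above score a ranked list of predicted edges against the gold-standard interactions of each network. A minimal sketch of how those two numbers are computed from a ranking (the edges below are illustrative, not real DREAM data): AUROC is the probability that a true edge ranks above a false one, and AUPR is computed here as average precision.

```python
# Score a ranked edge list against a gold standard, as in the DREAM
# evaluations above. Assumes ranked_edges contains every candidate edge,
# ordered from most to least confident.

def auroc_aupr(ranked_edges, true_edges):
    """Return (AUROC, AUPR) for a full ranking of candidate edges."""
    P = len(true_edges)                 # positives (verified interactions)
    N = len(ranked_edges) - P           # negatives in the ranking
    tp = fp = 0
    auroc = aupr = 0.0
    for edge in ranked_edges:
        if edge in true_edges:
            tp += 1
            aupr += tp / (tp + fp)      # precision at each new true positive
        else:
            fp += 1
            auroc += tp                 # true positives ranked above this negative
    return auroc / (P * N), aupr / P

if __name__ == "__main__":
    ranked = ["g1->g2", "g1->g3", "g4->g2", "g3->g5"]
    gold = {"g1->g2", "g3->g5"}
    print(auroc_aupr(ranked, gold))     # (0.5, 0.75)
```

A perfect ranking (all true edges first) yields AUROC = AUPR = 1.0; random rankings hover near AUROC = 0.5, which is why the AUROC values in the tables cluster between 0.5 and 0.8.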
Fig. 1 Boxplots of Overall Score on DREAM4 Multifactorial Networks with respect to the number of candidate regulatory genes
Fig. 2 Boxplots of Overall Score on DREAM4 Multifactorial Networks with respect to the parameter of PLSNET
Fig. 3 Boxplots of Overall Score on DREAM4 Multifactorial Networks with respect to the parameters of TIGRESS
Fig. 4 Boxplots of Overall Score on DREAM4 Multifactorial Networks with respect to the parameter of ARACNE
Performance comparisons of different GRN inference methods on the DREAM5 networks
| Method | Network 1 | | Network 3 | | Network 4 | | Overall Score |
|---|---|---|---|---|---|---|---|
| | AUPR | AUROC | AUPR | AUROC | AUPR | AUROC | |
| GENIE3 | 0.291 | 0.814 | 0.094 | 0.618 | 0.021 | 0.517 | 40.313 |
| TIGRESS | | 0.783 | 0.070 | 0.596 | 0.020 | 0.517 | 31.112 |
| CLR | 0.254 | 0.771 | 0.075 | 0.591 | 0.020 | 0.516 | 19.387 |
| ARACNE | 0.187 | 0.763 | 0.069 | 0.572 | 0.018 | 0.504 | 9.24 |
| NIMEFI | 0.298 | 0.817 | 0.101 | 0.625 | 0.022 | 0.518 | 46.015 |
| PLSNET | 0.270 | | 0.065 | 0.577 | | | |
| Winner of the Challenge | | | | | | | |
| GENIE3 | 0.291 | 0.815 | 0.093 | 0.617 | 0.021 | 0.518 | 40.279 |
| ANOVerence | 0.245 | 0.780 | | | 0.022 | | 34.023 |
| TIGRESS | 0.301 | 0.782 | 0.069 | 0.595 | 0.020 | 0.517 | 31.099 |
The best results in each column are in bold. Numbers in the "Winner of the Challenge" part of the table correspond to the best methods participating in the challenge, as listed on the DREAM web site
Comparisons of running times of different GRN inference methods
| Method | CPU time (in seconds) | | | |
|---|---|---|---|---|
| | DREAM4 (average over the 5 networks) | DREAM5 Network 1 | DREAM5 Network 3 | DREAM5 Network 4 |
| GENIE3 | 47.73 | 3.51E+4 | 1.36E+5 | 1.17E+5 |
| TIGRESS | 160.41 | 3.06E+4 | 9.08E+4 | 7.02E+4 |
| PLSNET | 136.71 | 4.22E+3 | 1.66E+4 | 2.09E+4 |