Jarosław Szkoła, Krzysztof Pancerz, Jan Warchoł.
Abstract
The main goal of this paper is to give the basis for creating a computer-based clinical decision support (CDS) system for laryngopathies. One of the approaches that can be used in the proposed CDS system is based on speech signal analysis using recurrent neural networks (RNNs). RNNs can be used for pattern recognition in time-series data owing to their ability to memorize information from the past. Elman networks (ENs) are a classical representative of RNNs. To improve the learning ability of ENs, we may modify them and combine them with another kind of RNN, namely the Jordan networks. The modified Elman-Jordan networks (EJNs) reach the target pattern faster and more exactly. Validation experiments were carried out on speech signals of patients from a control group as well as patients with two kinds of laryngopathies.
Year: 2011 PMID: 22007195 PMCID: PMC3189461 DOI: 10.1155/2011/289398
Source DB: PubMed Journal: Comput Intell Neurosci
Figure 1. A structure of the trained Elman neural network.
Figure 2. A structure of the trained Elman-Jordan neural network.
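Figures 1 and 2 contrast the two architectures: an Elman network feeds the previous hidden state back into the hidden layer, while the combined Elman-Jordan network additionally feeds the previous output back. A minimal forward-pass sketch, assuming tanh hidden units, linear outputs, and arbitrary layer sizes (the paper's exact structure is the one shown in the figures):

```python
import numpy as np

class ElmanJordanNetwork:
    """Minimal sketch of a combined Elman-Jordan recurrent network.

    The tanh/linear activations and the random initialization are
    assumptions for illustration, not the paper's exact setup.
    """

    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        s = 0.1
        self.W_in = rng.normal(0, s, (n_hidden, n_in))
        self.W_ctx = rng.normal(0, s, (n_hidden, n_hidden))   # Elman feedback: hidden -> hidden
        self.W_jrd = rng.normal(0, s, (n_hidden, n_out))      # Jordan feedback: output -> hidden
        self.W_out = rng.normal(0, s, (n_out, n_hidden))
        self.b_h = np.zeros(n_hidden)
        self.b_o = np.zeros(n_out)

    def forward(self, xs):
        """Run the network over a sequence of input vectors."""
        h = np.zeros(self.W_ctx.shape[0])   # Elman context (previous hidden state)
        y = np.zeros(self.W_out.shape[0])   # Jordan context (previous output)
        outputs = []
        for x in xs:
            h = np.tanh(self.W_in @ x + self.W_ctx @ h + self.W_jrd @ y + self.b_h)
            y = self.W_out @ h + self.b_o
            outputs.append(y)
        return np.array(outputs)
```

Both context vectors start at zero and are updated at every time step, which is what lets the network retain information from earlier samples of the signal.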
Algorithm 1. Algorithm for calculating an average mean squared error corresponding to deformations in a speech signal.
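Algorithm 1 itself is not reproduced in this record. A hedged sketch of the underlying measure, the mean squared error between a target pattern and the network's reproduction of it, averaged over fixed-length frames (the frame length and the frame-wise averaging are assumptions):

```python
import numpy as np

def average_mse(targets, predictions, frame_len=256):
    """Frame-wise average mean squared error between a target speech
    pattern and a network's reproduction of it.

    Hypothetical helper: Algorithm 1's exact windowing and
    normalization are not reproduced here.
    """
    targets = np.asarray(targets, dtype=float)
    predictions = np.asarray(predictions, dtype=float)
    n = (len(targets) // frame_len) * frame_len   # drop the ragged tail
    t = targets[:n].reshape(-1, frame_len)
    p = predictions[:n].reshape(-1, frame_len)
    per_frame = np.mean((t - p) ** 2, axis=1)     # MSE of each frame
    return float(per_frame.mean())                # average over frames
```

Averaging per frame rather than over the whole signal keeps localized deformations visible instead of diluting them across the entire recording.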
Figure 3. The block diagram of the process of the experiment.
Selected results of experiments for women obtained using the Elman network.

| Patient ID | Original signal | | Differentiated signal | |
|---|---|---|---|---|
| | 0.0068 | 389 | 0.0245 | 455 |
| | 2.4523 | 335 | 0.0208 | 650 |
| | 0.017 | 501 | 0.0341 | 497 |
| | 0.0109 | 597 | 0.01 | 422 |
| | 0.0332 | 662 | 0.0566 | 650 |
| | 0.0178 | 609 | 0.0324 | 656 |
| | 0.0096 | 428 | 0.0202 | 333 |
| | 0.0068 | 318 | 0.028 | 575 |
| | 0.008 | 490 | 0.0216 | 925 |
| | 0.0084 | 553 | 0.05 | 504 |
| | | | | |
| | 0.172 | 331 | 0.1081 | 564 |
| | 0.2764 | 536 | 0.1936 | 622 |
| | 0.0518 | 566 | 0.0533 | 593 |
| | 0.0268 | 504 | 0.0879 | 498 |
| | 0.0418 | 646 | 0.1726 | 547 |
| | 0.2107 | 444 | 0.2468 | 506 |
| | 0.0921 | 1040 | 0.1687 | 439 |
| | 0.0364 | 992 | 0.1396 | 758 |
| | 0.038 | 541 | 0.1061 | 826 |
| | 0.1461 | 363 | 0.2448 | 711 |
| | | | | |
| | 0.039 | 360 | 0.055 | 487 |
| | 0.1006 | 452 | 0.1 | 729 |
| | 0.1021 | 446 | 0.1583 | 608 |
| | 0.0636 | 780 | 0.0804 | 586 |
| | 0.1626 | 446 | 0.2376 | 545 |
| | 0.1953 | 477 | 0.1905 | 500 |
| | 0.2027 | 337 | 0.1661 | 378 |
| | 0.1927 | 457 | 0.1367 | 717 |
| | 0.2908 | 939 | 0.2139 | 865 |
| | 0.4357 | 679 | 0.3795 | 820 |
Selected results of experiments for women obtained using the modified Elman-Jordan network.

| Patient ID | Original signal | | Differentiated signal | |
|---|---|---|---|---|
| | 0.0061 | 88 | 0.0228 | 103 |
| | 0.0111 | 92 | 0.0193 | 90 |
| | 0.0178 | 107 | 0.0347 | 117 |
| | 0.0115 | 96 | 0.0086 | 35 |
| | 0.0301 | 146 | 0.0537 | 123 |
| | 0.0166 | 104 | 0.0328 | 76 |
| | 0.0086 | 78 | 0.0201 | 178 |
| | 0.0068 | 108 | 0.0248 | 116 |
| | 0.008 | 162 | 0.0204 | 106 |
| | 0.0087 | 119 | 0.0494 | 76 |
| | | | | |
| | 0.1677 | 92 | 0.1042 | 204 |
| | 0.3107 | 191 | 0.2108 | 47 |
| | 0.0542 | 96 | 0.0545 | 97 |
| | 0.0258 | 142 | 0.0853 | 144 |
| | 0.0423 | 239 | 0.1716 | 119 |
| | 0.2134 | 71 | 0.2428 | 86 |
| | 0.0877 | 40 | 0.1648 | 109 |
| | 0.0351 | 72 | 0.1362 | 132 |
| | 0.037 | 180 | 0.105 | 123 |
| | 0.1411 | 160 | 0.2382 | 96 |
| | | | | |
| | 0.0395 | 148 | 0.0534 | 117 |
| | 0.097 | 99 | 0.0991 | 96 |
| | 0.1053 | 115 | 0.1583 | 117 |
| | 0.0628 | 36 | 0.0784 | 70 |
| | 0.1596 | 133 | 0.2332 | 116 |
| | 0.1951 | 95 | 0.1945 | 90 |
| | 0.1954 | 51 | 0.1669 | 177 |
| | 0.191 | 99 | 0.1358 | 120 |
| | 0.281 | 106 | 0.2084 | 100 |
| | 0.4366 | 65 | 0.3746 | 77 |
The input data to be classified (fragment).

| Patient ID | | | |
|---|---|---|---|
| | 0.0061 | 0.0228 | |
| | 0.0111 | 0.0193 | |
| ⋮ | ⋮ | ⋮ | ⋮ |
| | 0.1677 | 0.1042 | |
| | 0.3107 | 0.2108 | |
| ⋮ | ⋮ | ⋮ | ⋮ |
| | 0.0395 | 0.0534 | |
| | 0.097 | 0.0991 | |
| ⋮ | ⋮ | ⋮ | ⋮ |
Figure 4. A decision tree obtained using the J4.8 algorithm.
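J4.8 is Weka's implementation of the C4.5 decision-tree learner. A rough stand-in can be sketched with scikit-learn's entropy-based tree, fed rows shaped like the "input data to be classified" table; the error values below are taken from the Elman-Jordan results above, while the group labels and the choice of classifier are assumptions for illustration only:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical training set: one row per patient with the network error
# on the original and on the differentiated signal.  Labels mark the
# control group and the two (unnamed) laryngopathy groups.
X = np.array([
    [0.0061, 0.0228], [0.0111, 0.0193],   # control group
    [0.1677, 0.1042], [0.3107, 0.2108],   # first laryngopathy
    [0.0395, 0.0534], [0.0970, 0.0991],   # second laryngopathy
])
y = ["control", "control",
     "laryngopathy A", "laryngopathy A",
     "laryngopathy B", "laryngopathy B"]

# scikit-learn's entropy-criterion tree is only an analogue of J4.8/C4.5,
# not the paper's exact classifier.
tree = DecisionTreeClassifier(criterion="entropy", random_state=0).fit(X, y)
```

A fitted tree of this kind yields axis-parallel thresholds on the two error features, which is the same readable if-then structure Figure 4 depicts.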