Yang Tao, Chunyan Li, Zhifang Liang, Haocheng Yang, Juan Xu.
Abstract
An electronic nose (E-nose) is an instrument that combines gas sensors with a pattern recognition algorithm to detect the type and concentration of gases. In realistic application scenarios, however, the sensors drift over time, which shifts the data distribution in feature space and degrades prediction accuracy. Drift compensation algorithms are therefore receiving increasing attention in the E-nose field. This paper proposes a novel drift compensation method, Wasserstein Distance Learned Feature Representations (WDLFR), based on learning domain-invariant feature representations. A neural network serves as a domain discriminator that measures the empirical Wasserstein distance between the source domain (drift-free data) and the target domain (drifted data), and WDLFR minimizes this distance by optimizing the feature extractor in an adversarial manner. For domain adaptation, the Wasserstein distance offers well-behaved gradients and a generalization bound. Finally, experiments are conducted on a real E-nose dataset from the University of California, San Diego (UCSD). The experimental results demonstrate that the proposed method outperforms all compared drift compensation methods and that WDLFR significantly reduces the effect of sensor drift.
Keywords: domain adaptation; drift compensation; electronic nose; feature representations
Year: 2019 PMID: 31454980 PMCID: PMC6749200 DOI: 10.3390/s19173703
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
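The adversarial objective described in the abstract, shrinking the empirical Wasserstein distance between drift-free (source) and drifted (target) feature distributions by adjusting the feature mapping, can be sketched in miniature. The snippet below is not the authors' method (WDLFR trains a neural feature extractor against a neural domain critic); it only illustrates the Wasserstein-1 objective on made-up 1-D features, using a hypothetical affine correction map fitted by finite-difference subgradient descent.

```python
def w1(xs, ys):
    """Empirical Wasserstein-1 distance between equal-size 1-D samples:
    the mean absolute difference of the sorted values."""
    assert len(xs) == len(ys)
    xs, ys = sorted(xs), sorted(ys)
    return sum(abs(x - y) for x, y in zip(xs, ys)) / len(xs)

def compensate(source, target, steps=3000, lr=0.01, eps=1e-4):
    """Fit a drift-correction map t -> a*t + b on the target features by
    minimizing W1(source, a*target + b) with finite-difference subgradient
    descent -- a toy stand-in for WDLFR's adversarial training."""
    a, b = 1.0, 0.0
    loss = lambda a_, b_: w1(source, [a_ * t + b_ for t in target])
    for _ in range(steps):
        ga = (loss(a + eps, b) - loss(a - eps, b)) / (2 * eps)
        gb = (loss(a, b + eps) - loss(a, b - eps)) / (2 * eps)
        a -= lr * ga
        b -= lr * gb
    return a, b

# Hypothetical drift-free batch, and the same batch after sensor drift
# (rescaled and shifted responses).
source = [0.1, 0.4, 0.5, 0.7, 0.9, 1.2]
target = [2.0 * s + 1.5 for s in source]

a, b = compensate(source, target)
corrected = [a * t + b for t in target]
print(w1(source, target), "->", w1(source, corrected))
```

Because the 1-D empirical Wasserstein-1 distance is simply the mean gap between sorted samples, it stays informative even when the two distributions do not overlap, which is the property the paper exploits to obtain usable gradients.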
Figure 1. Wasserstein Distance Learned Feature Representations (WDLFR) combined with the classifier.
Sensor drift benchmark dataset: number of samples per gas in each batch.
| Batch ID | Month | Acetone | Acetaldehyde | Ethanol | Ethylene | Ammonia | Toluene |
|---|---|---|---|---|---|---|---|
| Batch 1 | 1, 2 | 90 | 98 | 83 | 30 | 70 | 74 |
| Batch 2 | 3~10 | 164 | 334 | 100 | 109 | 532 | 5 |
| Batch 3 | 11~13 | 365 | 490 | 216 | 240 | 275 | 0 |
| Batch 4 | 14, 15 | 64 | 43 | 12 | 30 | 12 | 0 |
| Batch 5 | 16 | 28 | 40 | 20 | 46 | 63 | 0 |
| Batch 6 | 17~20 | 514 | 574 | 110 | 29 | 606 | 467 |
| Batch 7 | 21 | 649 | 662 | 360 | 744 | 630 | 568 |
| Batch 8 | 22, 23 | 30 | 30 | 40 | 33 | 143 | 18 |
| Batch 9 | 24, 30 | 61 | 55 | 100 | 75 | 78 | 101 |
| Batch 10 | 36 | 600 | 600 | 600 | 600 | 600 | 600 |
Figure 2. Two-dimensional principal component (PC1, PC2) scatter plot of the 10 data batches obtained by principal component analysis (PCA).
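As a side note on how such a PCA view is produced: pooling all batches, computing the leading principal components of the pooled covariance, and projecting each batch onto them makes a mean shift between batches show up as separation along PC1. The sketch below uses hypothetical 2-D features (not the E-nose data) for two "batches" related by a drift shift, with the closed-form eigendecomposition of a 2×2 covariance matrix.

```python
import math

def pca_axis(points):
    """Mean and leading principal axis (unit vector) of 2-D points, via the
    closed-form eigendecomposition of the 2x2 covariance matrix."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points) / n
    syy = sum((y - my) ** 2 for _, y in points) / n
    sxy = sum((x - mx) * (y - my) for x, y in points) / n
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    lam = tr / 2 + math.sqrt(max(tr * tr / 4 - det, 0.0))  # largest eigenvalue
    v = (lam - syy, sxy) if abs(sxy) > 1e-12 else (
        (1.0, 0.0) if sxx >= syy else (0.0, 1.0))
    norm = math.hypot(*v)
    return (mx, my), (v[0] / norm, v[1] / norm)

# Toy "batches": batch B is batch A translated by a drift vector.
batch_a = [(0.0, 0.0), (0.4, 0.1), (0.2, 0.3), (0.5, 0.5), (0.1, 0.2)]
drift = (3.0, 1.0)
batch_b = [(x + drift[0], y + drift[1]) for x, y in batch_a]

mean, pc1 = pca_axis(batch_a + batch_b)          # PCA on the pooled data
proj = lambda p: (p[0] - mean[0]) * pc1[0] + (p[1] - mean[1]) * pc1[1]
score_a = sum(map(proj, batch_a)) / len(batch_a)
score_b = sum(map(proj, batch_b)) / len(batch_b)
print(abs(score_b - score_a))   # the two batches separate along PC1
```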
Figure 3. Two-dimensional principal component scatter plot of the source and target domain feature representations after applying the proposed WDLFR method.
Recognition Accuracy (%) under Experimental Setting 1. Bold font marks the highest recognition accuracy for each batch among all compared algorithms.
| Methods | Batch 2 | Batch 3 | Batch 4 | Batch 5 | Batch 6 | Batch 7 | Batch 8 | Batch 9 | Batch 10 | Average |
|---|---|---|---|---|---|---|---|---|---|---|
| PCASVM | 82.40 | 84.80 | 80.12 | 75.13 | 73.57 | 56.16 | 48.64 | 67.45 | 49.14 | 68.60 |
| LDASVM | 47.27 | 57.76 | 50.93 | 62.44 | 41.48 | 37.42 | 68.30 | 52.34 | 31.17 | 49.91 |
| SVM-rbf | 74.36 | 61.03 | 50.93 | 18.27 | 28.26 | 28.81 | 20.07 | 34.26 | 34.47 | 38.94 |
| SVM-comgfk | 74.47 | 70.15 | 59.78 | 75.09 | 73.99 | 54.59 | 55.88 | 70.23 | 41.85 | 64.00 |
| DRCA | 89.15 | 92.69 | — | 86.52 | 60.25 | 62.24 | 72.34 | 52.00 | 77.63 | — |
| WDLFR | 86.41 | — | 80.75 | 93.40 | — | — | — | — | — | — |
Figure 4. Recognition accuracy bar chart under Experimental Settings 1 and 2.
Corresponding Parameter Setting (mini-batch size) of the WDLFR under Experimental Setting 1.
| Batch ID | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
|---|---|---|---|---|---|---|---|---|---|
| Mini-batch size | 12 | 12 | 32 | 16 | 32 | 64 | 14 | 16 | 16 |
Recognition Accuracy (%) under Experimental Setting 2. Bold font marks the highest recognition accuracy for each batch among all compared algorithms.
| Methods | 1 → 2 | 2 → 3 | 3 → 4 | 4 → 5 | 5 → 6 | 6 → 7 | 7 → 8 | 8 → 9 | 9 → 10 | Average |
|---|---|---|---|---|---|---|---|---|---|---|
| PCASVM | 82.40 | — | 83.23 | 72.59 | 36.70 | 74.98 | 58.16 | 84.04 | 30.61 | 69.06 |
| LDASVM | 47.27 | 46.72 | 70.81 | 85.28 | 48.87 | 75.15 | 77.21 | 62.77 | 30.25 | 60.48 |
| SVM-rbf | 74.36 | 87.83 | 90.06 | 56.35 | 42.52 | 83.53 | — | 62.98 | 22.64 | 68.01 |
| SVM-comgfk | 74.47 | 73.75 | 78.51 | 64.26 | 69.97 | 77.69 | 82.69 | 85.53 | 17.76 | 69.40 |
| DRCA | 89.15 | 98.11 | 95.03 | 69.54 | 50.87 | 78.94 | 65.99 | 84.04 | 36.31 | 74.22 |
| WDLFR | 86.41 | 92.13 | — | — | — | 89.12 | — | — | — | — |
Corresponding Parameter Setting (mini-batch size) of the WDLFR under Experimental Setting 2.
| Batch ID | 1 → 2 | 2 → 3 | 3 → 4 | 4 → 5 | 5 → 6 | 6 → 7 | 7 → 8 | 8 → 9 | 9 → 10 |
|---|---|---|---|---|---|---|---|---|---|
| Mini-batch size | 12 | 16 | 32 | 32 | 12 | 64 | 14 | 12 | 16 |