| Literature DB >> 35479600 |
Peng Gao, Jingmei Li, Guodong Zhao, Changhong Ding.
Abstract
Traditional unsupervised transfer learning assumes that the samples are collected from a single domain. In practical applications, however, samples from a single source domain are often insufficient, and labeled data are usually collected from multiple domains. In recent years, deep multisource unsupervised transfer learning has focused on aligning features in a common space and then minimizing the distribution difference between the source and target domains, whether the marginal distribution, the conditional distribution, or both. Moreover, the conditional and marginal distributions are often treated as equally important, which leads to poor performance in practical applications, and the existing algorithms that do balance the two distributions are based on a single source domain. To address these problems, we propose a multisource transfer learning algorithm based on balanced distribution adaptation (MTLBDA). The algorithm adjusts the relative weights of the two distributions to solve the distribution adaptation problem in multisource transfer learning. Extensive experiments show that MTLBDA achieves significant results on popular image classification datasets such as Office-31.
Year: 2022 PMID: 35479600 PMCID: PMC9038385 DOI: 10.1155/2022/6915216
Source DB: PubMed Journal: Comput Intell Neurosci
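The weighting of marginal versus conditional distributions described in the abstract can be written as a single objective. The following is a minimal sketch in the spirit of balanced distribution adaptation extended to N source domains; the symbols (μ for the balance factor, MMD as the distance measure, w_j as a per-source weight) are illustrative assumptions rather than the paper's exact notation.

```latex
% Hedged sketch: balanced distribution adaptation over N source domains.
% mu in [0,1] trades off marginal vs. conditional alignment; w_j weights source j.
\begin{aligned}
\mathcal{L} &= \sum_{j=1}^{N} w_j \Big[ \mathcal{L}_{\mathrm{cls}}\big(f(X_{s_j}), Y_{s_j}\big)
              + \lambda\, D_{\mu}\big(\mathcal{D}_{s_j}, \mathcal{D}_t\big) \Big],\\
D_{\mu}\big(\mathcal{D}_{s_j}, \mathcal{D}_t\big)
            &= (1-\mu)\,\mathrm{MMD}\big(P(X_{s_j}), P(X_t)\big)
              + \mu \sum_{c=1}^{C} \mathrm{MMD}\big(P(X_{s_j}\mid y=c),\, P(X_t\mid \hat{y}=c)\big).
\end{aligned}
```

Setting μ = 0 reduces the distance to purely marginal alignment, μ = 1 to purely conditional alignment; intermediate values interpolate between the two.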
Figure 1. Multisource transfer learning.
Figure 2. The framework of multisource deep transfer learning based on balanced distribution adaptation (MTLBDA). Our model consists of three components: (i) a common feature extractor, (ii) a domain-specific distribution balancer, and (iii) a domain-specific regularization term.
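A minimal sketch of how the three components in Figure 2 could be wired together, assuming a PyTorch implementation on precomputed backbone features; the class names (SharedExtractor, MTLBDANet), layer sizes, and the use of one classifier head per source domain are illustrative assumptions, not the paper's published code.

```python
# Hedged sketch of the Figure 2 architecture, assuming PyTorch.
import torch
import torch.nn as nn

class SharedExtractor(nn.Module):
    """(i) Common feature extractor shared by all source domains and the target."""
    def __init__(self, in_dim=2048, feat_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 512), nn.ReLU(),
            nn.Linear(512, feat_dim), nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)

class MTLBDANet(nn.Module):
    """Shared extractor plus one classifier head per source domain.
    Components (ii) distribution balancing and (iii) the regularization term
    act on the outputs of this network through the training loss."""
    def __init__(self, num_sources, num_classes, in_dim=2048, feat_dim=256):
        super().__init__()
        self.extractor = SharedExtractor(in_dim, feat_dim)
        self.heads = nn.ModuleList(
            [nn.Linear(feat_dim, num_classes) for _ in range(num_sources)]
        )

    def forward(self, x, source_idx):
        feat = self.extractor(x)
        return feat, self.heads[source_idx](feat)
```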
MTLBDA algorithm steps (a hedged sketch of one training step follows the box).
| MTLBDA algorithm training |
|---|
| Input: N source domains |
| Output: Loss function |
| 1: Give the number of training iterations |
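As a complement to the algorithm box above, the following is a minimal sketch of one training iteration. The per-source classification loss, the simplified MMD distance balanced by μ, the use of target pseudo-labels for conditional alignment, and the L1 consistency penalty between source-specific predictions as the regularization term are all assumptions made for illustration, not the paper's exact procedure.

```python
# Hedged sketch of one MTLBDA-style training step; all specifics are assumptions.
import torch
import torch.nn.functional as F

def mmd(a, b):
    """Simplified linear-kernel MMD between two feature batches (illustrative only)."""
    return (a.mean(dim=0) - b.mean(dim=0)).pow(2).sum()

def train_step(model, optimizer, source_batches, target_x, mu=0.5, lam=1.0, gamma=0.1):
    """source_batches: list of (x_j, y_j) tuples, one per source domain."""
    optimizer.zero_grad()
    tgt_preds, loss = [], 0.0
    for j, (xs, ys) in enumerate(source_batches):
        fs, logits_s = model(xs, j)            # source features and predictions
        ft, logits_t = model(target_x, j)      # target data through the same head
        cls = F.cross_entropy(logits_s, ys)    # supervised loss on source j
        marginal = mmd(fs, ft)                 # marginal alignment
        # Conditional alignment using target pseudo-labels (assumed strategy).
        pseudo = logits_t.argmax(dim=1)
        cond = 0.0
        for c in ys.unique():
            s_c, t_c = fs[ys == c], ft[pseudo == c]
            if len(s_c) and len(t_c):
                cond = cond + mmd(s_c, t_c)
        loss = loss + cls + lam * ((1 - mu) * marginal + mu * cond)
        tgt_preds.append(F.softmax(logits_t, dim=1))
    # (iii) Regularization: keep the domain-specific heads consistent on target data.
    for j in range(1, len(tgt_preds)):
        loss = loss + gamma * (tgt_preds[j] - tgt_preds[0]).abs().mean()
    loss.backward()
    optimizer.step()
    return loss.item()
```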
Figure 3. Example of USPS, MNIST, and SVHN pictures.
Average classification accuracy (%) on the digit classification datasets; standard deviations are shown in parentheses.
| Algorithms | U→S | M→S | U→M | S→M | M→U | S→U |
|---|---|---|---|---|---|---|
| DAN | 68.23 (0.43) | 67.84 (0.41) | 97.5 (0.62) | 66.91 (0.83) | 93.49 (0.85) | 65.33 (1.12) |
| DANN | 68.65 (0.88) | 68.14 (0.82) | 97.92 (0.81) | 67.23 (0.94) | 93.47 (0.79) | 66.25 (0.97) |
| BDA | 68.36 (0.45) | 67.72 (0.39) | 98.13 (0.41) | 67.54 (0.73) | 93.62 (0.51) | 65.13 (0.82) |
| DDAN | 70.52 (0.61) | 69.53 (0.59) | 98.22 (0.46) | 68.95 (0.39) | 94.23 (0.41) | 66.89 (0.55) |

| Algorithms | U, M→S |  | U, S→M |  | M, S→U |  |
|---|---|---|---|---|---|---|
| DCTN | 77.61 (0.41) | 76.83 (0.39) | 96.23 (0.82) | 96.85 (0.73) | 92.81 (0.47) | 93.08 (0.56) |
| MFSAN | 78.56 (0.95) | 78.16 (1.12) | 98.08 (0.97) | 97.78 (1.17) | 94.23 (0.79) | 93.83 (0.92) |
| M3SDA | 80.25 (0.82) | 79.47 (0.75) | 97.48 (0.85) | 97.63 (0.91) | 94.67 (0.93) | 95.31 (0.96) |
| MTLBDA | 83.56 (0.66) | 82.82 (0.41) | 97.91 (0.76) | 98.43 (0.68) | 96.14 (0.81) | 94.81 (1.03) |
Figure 4. Example of Office-31 and Caltech-256 pictures.
Average classification accuracy (%) on the image classification datasets; standard deviations are shown in parentheses.
| Algorithms | A→H | W→H | D→H | A→D | W→D | H→D |
|---|---|---|---|---|---|---|
| DAN | 81.73 (0.89) | 70.87 (1.35) | 77.96 (0.77) | 95.71 (0.59) | 98.25 (0.55) | 97.12 (0.42) |
| DANN | 85.23 (1.06) | 75.31 (1.12) | 83.15 (0.93) | 96.12 (0.87) | 98.12 (0.83) | 97.35 (0.78) |
| BDA | 87.95 (0.66) | 80.47 (0.78) | 85.73 (0.57) | 95.42 (0.64) | 98.46 (0.56) | 97.52 (0.55) |
| DDAN | 89.34 (0.94) | 80.22 (0.97) | 86.45 (1.04) | 96.83 (0.98) | 98.79 (0.93) | 98.17 (1.03) |

| Algorithms | A→W | D→W | H→W | W→A | D→A | H→A |
|---|---|---|---|---|---|---|
| DAN | 93.47 (0.87) | 96.31 (0.42) | 93.69 (0.76) | 88.84 (1.08) | 90.27 (1.12) | 90.82 (1.23) |
| DANN | 95.31 (0.82) | 96.24 (0.45) | 95.75 (1.01) | 89.25 (1.06) | 91.76 (1.18) | 90.41 (1.07) |
| BDA | 96.24 (0.62) | 96.14 (0.53) | 95.43 (0.75) | 90.55 (0.87) | 91.43 (0.85) | 90.73 (0.74) |
| DDAN | 96.52 (1.01) | 96.84 (0.99) | 96.26 (0.87) | 90.95 (0.82) | 92.13 (0.86) | 91.44 (0.99) |

| Algorithms | A, W, D→H |  |  | A, W, H→D |  |  |
|---|---|---|---|---|---|---|
| DCTN | 89.51 (0.53) | 90.24 (0.48) | 88.65 (0.71) | 98.25 (0.45) | 99.06 (0.52) | 98.76 (0.47) |
| MFSAN | 91.43 (0.48) | 90.54 (0.69) | 91.17 (0.56) | 99.27 (0.58) | 98.03 (0.65) | 98.77 (0.52) |
| M3SDA | 91.22 (0.52) | 90.63 (0.49) | 90.58 (0.54) | 98.96 (0.63) | 98.48 (0.46) | 98.65 (0.43) |
| MTLBDA | 92.24 (0.35) | 91.89 (0.42) | 92.03 (0.43) | 99.01 (0.45) | 99.28 (0.51) | 98.68 (0.49) |

| Algorithms | A, D, H→W |  |  | W, D, H→A |  |  |
|---|---|---|---|---|---|---|
| DCTN | 97.67 (0.65) | 98.82 (0.57) | 99.03 (0.55) | 92.71 (0.67) | 90.37 (0.85) | 91.63 (0.76) |
| MFSAN | 99.48 (0.38) | 98.37 (0.69) | 99.08 (0.43) | 91.54 (0.75) | 93.26 (0.82) | 94.14 (0.65) |
| M3SDA | 99.31 (0.48) | 99.15 (0.51) | 98.78 (0.62) | 93.72 (0.68) | 92.63 (0.59) | 94.26 (0.63) |
| MTLBDA | 99.52 (0.47) | 98.63 (0.54) | 98.75 (0.61) | 94.53 (0.55) | 92.91 (0.72) | 93.86 (0.53) |
Figure 5. Example of DomainNet pictures.
Data distribution example of DomainNet; the last three columns give the number of samples in example subcategories.
| Domain | Number of categories | Total samples | Furniture | Mammal | Tool |
|---|---|---|---|---|---|
| Clipart | 345 | 48921 | 5802 | 3437 | 3812 |
| Infographic | 345 | 53779 | 6513 | 3602 | 3096 |
| Painting | 345 | 76794 | 5002 | 8982 | 5124 |
| Quickdraw | 345 | 173500 | 17500 | 12500 | 14000 |
| Real | 345 | 70465 | 17104 | 15538 | 12938 |
| Sketch | 345 | 599859 | 7529 | 5151 | 4876 |
Figure 6. Comparison on DomainNet.
Figure 7. Effect of the number of iterations on accuracy.
Figure 8. Effect of μ on accuracy.
Figure 9. t-SNE plot of Caltech.