TY - GEN
T1 - Adversarial Transform Networks for Unsupervised Transfer Learning
AU - Cai, Guanyu
AU - Wang, Yuqin
AU - He, Lianghua
AU - Zhou, Mengchu
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/10/30
Y1 - 2020/10/30
N2 - Transfer learning, especially unsupervised domain adaptation, is a crucial technology for sample-efficient learning. Recently, deep adversarial domain adaptation methods, which introduce a domain classifier to promote domain-invariant representations, have performed remarkably well in various tasks. However, previous methods either constrain representational ability by using an identical feature extractor for both domains or ignore the relationship between domains by using separate extractors. In this paper, we propose a novel adversarial domain adaptation method named Adversarial Transform Network (ATN) to both enhance representational ability and transfer general information between domains. Residual connections are used to share features in the bottom layers, delivering transferable features that boost generalization performance. Moreover, a regularizer is proposed to alleviate the vanishing gradient problem, thus stabilizing the optimization procedure. Extensive experiments show that the proposed ATN is comparable with state-of-the-art methods and effectively deals with the vanishing gradient problem.
AB - Transfer learning, especially unsupervised domain adaptation, is a crucial technology for sample-efficient learning. Recently, deep adversarial domain adaptation methods, which introduce a domain classifier to promote domain-invariant representations, have performed remarkably well in various tasks. However, previous methods either constrain representational ability by using an identical feature extractor for both domains or ignore the relationship between domains by using separate extractors. In this paper, we propose a novel adversarial domain adaptation method named Adversarial Transform Network (ATN) to both enhance representational ability and transfer general information between domains. Residual connections are used to share features in the bottom layers, delivering transferable features that boost generalization performance. Moreover, a regularizer is proposed to alleviate the vanishing gradient problem, thus stabilizing the optimization procedure. Extensive experiments show that the proposed ATN is comparable with state-of-the-art methods and effectively deals with the vanishing gradient problem.
KW - Adversarial transform networks
KW - vanishing gradient problem
KW - transfer learning
UR - http://www.scopus.com/inward/record.url?scp=85096351420&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85096351420&partnerID=8YFLogxK
U2 - 10.1109/ICNSC48988.2020.9238125
DO - 10.1109/ICNSC48988.2020.9238125
M3 - Conference contribution
AN - SCOPUS:85096351420
T3 - 2020 IEEE International Conference on Networking, Sensing and Control, ICNSC 2020
BT - 2020 IEEE International Conference on Networking, Sensing and Control, ICNSC 2020
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2020 IEEE International Conference on Networking, Sensing and Control, ICNSC 2020
Y2 - 30 October 2020 through 2 November 2020
ER -