Learning Transferable Features with Deep Adaptation Networks
TLDR
A new Deep Adaptation Network (DAN) architecture is proposed, which generalizes deep convolutional neural networks to the domain adaptation scenario, learns transferable features with statistical guarantees, and scales linearly via an unbiased estimate of the kernel embedding.
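The linear scaling mentioned above comes from a linear-time unbiased estimator of the squared maximum mean discrepancy (MMD), the statistic DAN minimizes across domains. Below is a minimal NumPy sketch under assumed choices (a Gaussian kernel with a free bandwidth `sigma`); it is an illustration of the estimator, not the paper's implementation.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel between two feature vectors."""
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

def linear_mmd(source, target, sigma=1.0):
    """Linear-time unbiased MMD^2 estimator: average the kernel statistic
    over disjoint sample pairs, touching each sample once (O(n) cost)."""
    n = (min(len(source), len(target)) // 2) * 2  # even number of samples
    total = 0.0
    for i in range(0, n, 2):
        x1, x2 = source[i], source[i + 1]
        y1, y2 = target[i], target[i + 1]
        total += (gaussian_kernel(x1, x2, sigma) + gaussian_kernel(y1, y2, sigma)
                  - gaussian_kernel(x1, y2, sigma) - gaussian_kernel(x2, y1, sigma))
    return total / (n // 2)
```

When source and target samples come from the same distribution the estimate is near zero; it grows as the two distributions drift apart, which is what makes it usable as a training loss.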
Transfer Feature Learning with Joint Distribution Adaptation
TLDR
JDA aims to jointly adapt both the marginal and conditional distributions in a principled dimensionality reduction procedure, constructing a new feature representation that is effective and robust under substantial distribution differences.
Conditional Adversarial Domain Adaptation
TLDR
Conditional adversarial domain adaptation is presented, a principled framework that conditions the adversarial adaptation models on the discriminative information conveyed in classifier predictions to guarantee transferability.
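The conditioning step can be sketched as a multilinear map: the feature vector and the classifier's softmax prediction are combined via an outer product, and this joint representation is what the domain discriminator sees. This is a minimal NumPy illustration with hypothetical array names, not the paper's code.

```python
import numpy as np

def multilinear_map(features, predictions):
    """Condition features on classifier predictions via the outer product
    f (x) g per sample, flattened into one vector for the discriminator.
    features: (batch, d) feature vectors; predictions: (batch, c) softmax outputs.
    Returns an array of shape (batch, d * c)."""
    return np.einsum('bi,bj->bij', features, predictions).reshape(len(features), -1)
```

Because the outer product entangles features with class probabilities, the discriminator can align the joint distribution of features and predictions rather than features alone.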
Deep Transfer Learning with Joint Adaptation Networks
TLDR
JAN is presented, which learns a transfer network by aligning the joint distributions of multiple domain-specific layers across domains based on a joint maximum mean discrepancy (JMMD) criterion.
Unsupervised Domain Adaptation with Residual Transfer Networks
TLDR
Empirical evidence shows that this approach to domain adaptation in deep networks, which jointly learns adaptive classifiers and transferable features from labeled source-domain data and unlabeled target-domain data, outperforms state-of-the-art methods on standard domain adaptation benchmarks.
HashNet: Deep Learning to Hash by Continuation
TLDR
HashNet is presented, a novel deep architecture for learning to hash by a continuation method with convergence guarantees, which learns exactly binary hash codes from imbalanced similarity data.
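The continuation idea can be sketched as replacing the non-smooth sign activation with tanh(β·z) for an increasing sequence of β values, so the relaxed real-valued codes converge to exactly binary codes as training proceeds. The β schedule below is an assumed illustration, not the paper's schedule.

```python
import numpy as np

def continuation_codes(z, betas=(1, 2, 4, 8, 16)):
    """Smoothed binary codes tanh(beta * z) for an increasing beta schedule.
    As beta grows, tanh(beta * z) approaches sign(z), the exact binary code,
    while every intermediate stage stays differentiable for backpropagation."""
    return [np.tanh(beta * z) for beta in betas]
```

Each successive relaxation is a strictly better approximation of sign(z), which is the sense in which the continuation converges to binary codes.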
Transfer Joint Matching for Unsupervised Domain Adaptation
TLDR
This paper aims to reduce the domain difference by jointly matching features and reweighting instances across domains in a principled dimensionality reduction procedure, constructing a new feature representation that is invariant to both the distribution difference and the irrelevant instances.
Deep Hashing Network for Efficient Similarity Retrieval
TLDR
A novel Deep Hashing Network (DHN) architecture for supervised hashing is proposed, which jointly learns a good image representation tailored to hash coding and formally controls the quantization error.
Partial Adversarial Domain Adaptation
TLDR
This paper presents Partial Adversarial Domain Adaptation (PADA), which simultaneously alleviates negative transfer by down-weighting the data of outlier source classes for training both the source classifier and the domain adversary, and promotes positive transfer by matching the feature distributions in the shared label space.
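The down-weighting step can be sketched as averaging the classifier's softmax predictions over target samples into per-class weights: classes that are rarely predicted on the target domain (candidate outlier source classes) receive small weights. A minimal sketch of this weighting heuristic, assuming predictions are already available as a NumPy array:

```python
import numpy as np

def class_weights(target_predictions):
    """Average the softmax predictions over all target samples, then
    normalize by the largest entry. Classes rarely predicted on the
    target domain get weights near zero, down-weighting their source
    data when training the classifier and the domain adversary."""
    w = target_predictions.mean(axis=0)  # shape: (num_classes,)
    return w / w.max()
```

The normalization keeps the largest weight at 1, so shared classes train at full strength while outlier classes are suppressed rather than hard-thresholded.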
Adaptation Regularization: A General Framework for Transfer Learning
TLDR
A novel transfer learning framework, referred to as Adaptation Regularization based Transfer Learning (ARTL), models adaptive classifiers in a unified way based on the structural risk minimization principle and regularization theory, and significantly outperforms state-of-the-art learning methods on several public text and image datasets.