Publications
Learning Transferable Features with Deep Adaptation Networks
TLDR
A new Deep Adaptation Network (DAN) architecture is proposed, which generalizes deep convolutional neural networks to the domain adaptation scenario, learns transferable features with statistical guarantees, and scales linearly via an unbiased estimate of the kernel embedding.
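The linear-time scaling mentioned in this summary comes from the streaming, unbiased MMD statistic of Gretton et al., which pairs up samples instead of forming all O(n^2) kernel evaluations. A minimal NumPy sketch with a single Gaussian kernel (an illustrative simplification; DAN itself uses a multi-kernel MMD, and the function names here are my own):

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian RBF kernel between two feature vectors."""
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

def mmd_linear(xs, xt, sigma=1.0):
    """Linear-time unbiased estimate of MMD^2 between source features xs
    and target features xt: consume samples in quadruples (x1, x2, y1, y2)
    so the cost is O(n) rather than O(n^2)."""
    n = (min(len(xs), len(xt)) // 2) * 2  # use an even number of samples
    est = 0.0
    for i in range(0, n, 2):
        x1, x2 = xs[i], xs[i + 1]
        y1, y2 = xt[i], xt[i + 1]
        est += (gaussian_kernel(x1, x2, sigma) + gaussian_kernel(y1, y2, sigma)
                - gaussian_kernel(x1, y2, sigma) - gaussian_kernel(x2, y1, sigma))
    return est / (n // 2)
```

On two samples from the same distribution the estimate is close to zero; a shift between domains drives it up, which is what the network is penalized for at its adapted layers.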
Transfer Feature Learning with Joint Distribution Adaptation
TLDR
JDA aims to jointly adapt both the marginal and conditional distributions in a principled dimensionality reduction procedure, constructing a new feature representation that is effective and robust under substantial distribution differences.
Conditional Adversarial Domain Adaptation
TLDR
Conditional adversarial domain adaptation is presented: a principled framework that conditions the adversarial adaptation models on discriminative information conveyed in the classifier predictions to guarantee transferability.
Deep Transfer Learning with Joint Adaptation Networks
TLDR
JAN is presented, which learns a transfer network by aligning the joint distributions of multiple domain-specific layers across domains based on a joint maximum mean discrepancy (JMMD) criterion.
Unsupervised Domain Adaptation with Residual Transfer Networks
TLDR
Empirical evidence shows that this new approach to domain adaptation in deep networks, which jointly learns adaptive classifiers and transferable features from labeled data in the source domain and unlabeled data in the target domain, outperforms state-of-the-art methods on standard domain adaptation benchmarks.
HashNet: Deep Learning to Hash by Continuation
TLDR
HashNet is presented, a novel deep architecture for deep learning to hash by a continuation method with convergence guarantees, which learns exactly binary hash codes from imbalanced similarity data.
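The continuation idea is to replace the non-differentiable sign(z) activation with tanh(beta * z) and let beta grow across training stages, so the smooth surrogate converges to exact binary codes. A hedged sketch of that schedule (the beta values are illustrative, not HashNet's published schedule):

```python
import numpy as np

def continuation_codes(z, betas=(1.0, 3.0, 10.0, 30.0)):
    """Return tanh(beta * z) for an increasing schedule of beta values.
    As beta grows, tanh(beta * z) approaches sign(z) elementwise, so the
    final stage yields (almost) exactly binary hash codes."""
    return [np.tanh(beta * z) for beta in betas]
```

In HashNet each stage warm-starts from the network trained at the previous, smaller beta, which is what underpins the convergence guarantee mentioned in the summary.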
Transfer Joint Matching for Unsupervised Domain Adaptation
TLDR
This paper aims to reduce the domain difference by jointly matching the features and reweighting the instances across domains in a principled dimensionality reduction procedure, constructing a new feature representation that is invariant to both the distribution difference and the irrelevant instances.
Deep Hashing Network for Efficient Similarity Retrieval
TLDR
A novel Deep Hashing Network (DHN) architecture for supervised hashing is proposed, in which good image representations tailored to hash coding and formal control of the quantization error are jointly learned.
Semantics-preserving hashing for cross-view retrieval
TLDR
This paper proposes an effective Semantics-Preserving Hashing method, termed SePH, which transforms the semantic affinities of the training data into a probability distribution as supervised information and approximates it with the to-be-learnt hash codes in Hamming space by minimizing the Kullback-Leibler divergence.
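The objective can be sketched as follows: normalize pairwise semantic affinities into a distribution P, derive a distribution Q from pairwise distances between hash codes, and score the codes by KL(P || Q). A minimal NumPy sketch; the Student-t style kernel in `hamming_prob` is an illustrative choice, not necessarily SePH's exact parameterization:

```python
import numpy as np

def affinity_to_prob(A):
    """Normalize a pairwise semantic-affinity matrix into a probability
    distribution over all (i, j) pairs (the supervised information)."""
    return A / A.sum()

def hamming_prob(H):
    """Turn pairwise distances of codes H in {-1, +1}^{n x b} into a
    distribution, using a Student-t style 1/(1+d) kernel (assumption:
    an illustrative choice, not SePH's exact parameterization)."""
    # squared Euclidean distance of +-1 codes is 4 * Hamming distance
    D = np.sum((H[:, None, :] - H[None, :, :]) ** 2, axis=-1) / 4.0
    Q = 1.0 / (1.0 + D)
    return Q / Q.sum()

def kl_divergence(P, Q, eps=1e-12):
    """KL(P || Q) over the pairwise distributions; eps guards log(0)."""
    return float(np.sum(P * np.log((P + eps) / (Q + eps))))
```

Minimizing this divergence pushes the Hamming-space geometry of the codes toward the semantic-affinity geometry, which is the sense in which SePH is "semantics-preserving".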
Adaptation Regularization: A General Framework for Transfer Learning
TLDR
A novel transfer learning framework, referred to as Adaptation Regularization based Transfer Learning (ARTL), models adaptive classifiers in a unified way based on the structural risk minimization principle and regularization theory, and can significantly outperform state-of-the-art learning methods on several public text and image datasets.