Corpus ID: 12134482

Deep Low-Rank Coding for Transfer Learning

@inproceedings{Ding2015DeepLC,
  title={Deep Low-Rank Coding for Transfer Learning},
  author={Zhengming Ding and Ming Shao and Yun Raymond Fu},
  booktitle={IJCAI},
  year={2015}
}
Recent research on transfer learning exploits deep structures for discriminative feature representation to tackle cross-domain disparity. However, few methods are able to perform joint feature learning and knowledge transfer in a unified deep framework. In this paper, we develop a novel approach, called Deep Low-Rank Coding (DLRC), for transfer learning. Specifically, discriminative low-rank coding is achieved under the guidance of an iterative supervised structure term for each single layer. In this way…
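Low-rank coding of the kind the abstract describes is usually solved with a nuclear-norm penalty, whose proximal operator is singular value thresholding. A minimal illustrative sketch of that step in numpy (an assumption about the general technique, not the authors' actual DLRC solver):

```python
import numpy as np

def svt(Z, tau):
    """Singular value thresholding: the proximal operator of the
    nuclear norm ||Z||_*, the workhorse step in most low-rank
    coding solvers. Soft-thresholds the singular values by tau."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

# A matrix with singular values [3, 2, 0.5]; thresholding at tau=1
# shrinks them to [2, 1, 0], yielding a rank-2 result.
A = np.diag([3.0, 2.0, 0.5])
B = svt(A, 1.0)
print(np.round(np.linalg.svd(B, compute_uv=False), 6))  # [2. 1. 0.]
```

Iterating this shrinkage inside an alternating scheme is what drives the coding matrix toward low rank while other terms (e.g. a supervised structure term) shape discriminability.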
Deep Transfer Low-Rank Coding for Cross-Domain Learning
  • Zhengming Ding, Y. Fu
  • Computer Science, Medicine
  • IEEE Transactions on Neural Networks and Learning Systems
  • 2019
A novel deep transfer low-rank coding based on deep convolutional neural networks, where multilayer common dictionaries shared across two domains are obtained to bridge the domain gap so that more enriched domain-invariant knowledge can be captured in a layerwise fashion.
Deep Domain Adaptation
This chapter develops three novel deep domain adaptation approaches for knowledge transfer, proposing a Deep Low-Rank Coding (DLRC) framework and a novel Deep Transfer Low-Rank Coding (DTLC) framework to uncover more shared knowledge across source and target domains in a multi-layer manner.
Task-driven deep transfer learning for image classification
A task-driven deep transfer learning framework for image classification, where the deep feature and classifier are obtained simultaneously for optimal classification performance; the superiority of the proposed algorithm is demonstrated by comparison with other methods.
Deep Robust Encoder Through Locality Preserving Low-Rank Dictionary
A novel Deep Robust Encoder (DRE) is proposed through a locality preserving low-rank dictionary to extract robust and discriminative features from corrupted data, where a low-rank dictionary and a regularized deep auto-encoder are jointly optimized.
Marginalized Denoising Dictionary Learning With Locality Constraint
A unified feature learning framework is developed by incorporating the marginalized denoising auto-encoder into a locality-constrained dictionary learning scheme, named marginalized denoising dictionary learning, in order to learn a more concise and pure feature space while inheriting the discrimination from sub-dictionary learning.
Robust Dictionary Learning
This chapter focuses on building a self-taught coding framework, which can effectively utilize the rich low-level pattern information abstracted from the auxiliary domain in order to characterize the high-level structural information in the target domain.
Deep Nonlinear Feature Coding for Unsupervised Domain Adaptation
This paper builds on the marginalized stacked denoising autoencoder (mSDA) to extract rich deep features and introduces two new elements to mSDA: domain divergence minimization by Maximum Mean Discrepancy (MMD), and nonlinear coding by kernelization.
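The MMD criterion mentioned in that summary admits a compact empirical estimate. A hedged numpy sketch of the standard biased squared-MMD estimator with an RBF kernel (an illustration of the general statistic, not that paper's implementation):

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Pairwise RBF kernel matrix k(a, b) = exp(-gamma * ||a - b||^2)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def mmd2(Xs, Xt, gamma=1.0):
    """Biased empirical estimate of squared Maximum Mean Discrepancy
    between source samples Xs and target samples Xt."""
    return (rbf_kernel(Xs, Xs, gamma).mean()
            + rbf_kernel(Xt, Xt, gamma).mean()
            - 2.0 * rbf_kernel(Xs, Xt, gamma).mean())

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
print(mmd2(X, X) < 1e-10)              # identical samples: divergence ~ 0
print(mmd2(X, X + 3.0) > mmd2(X, X))   # shifted samples: larger divergence
```

Adding this quantity as a penalty to a feature-learning objective pushes the learned source and target representations toward matching distributions, which is the role MMD plays in domain adaptation methods like the one above.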
Deep Sparse Informative Transfer SoftMax for Cross-Domain Image Classification
This paper presents a novel transfer SoftMax model called Sparse Informative Transfer SoftMax (SITS) to deal with the problem of cross-domain image classification, and develops a Deep SITS network to efficiently learn an informative transfer model and enhance the transferability of deep neural networks.
Class-Specific Reconstruction Transfer Learning for Visual Recognition Across Domains
This article proposes a novel class-wise reconstruction-based adaptation method called Class-Specific Reconstruction Transfer Learning (CRTL), which optimizes a well-modeled transfer loss function by fully exploiting intra-class dependency and inter-class independency.
Spectral Bisection Tree Guided Deep Adaptive Exemplar Autoencoder for Unsupervised Domain Adaptation
This paper extends deep representation learning to the domain adaptation scenario and proposes a novel deep model called the Deep Adaptive Exemplar AutoEncoder (DAE2); unlike conventional denoising autoencoders that use corrupted inputs, it gradually extracts discriminant features layer by layer.

References

Showing 1-10 of 50 references
Latent Low-Rank Transfer Subspace Learning for Missing Modality Recognition
Experimental results on multi-modality knowledge transfer with missing target data demonstrate that the L2TSL method can successfully inherit knowledge from the auxiliary database to complete the target domain, and therefore enhance performance when recognizing data from a modality without any training data.
Hybrid Heterogeneous Transfer Learning through Deep Learning
A deep learning approach to learn a feature mapping between cross-domain heterogeneous features, as well as a better feature representation for mapped data, to reduce the bias caused by cross-domain correspondences.
Joint Hierarchical Domain Adaptation and Feature Learning
Complex visual data contain discriminative structures that are difficult to fully capture with any single feature descriptor. While recent work in domain adaptation focuses on adapting a single…
Generalized Transfer Subspace Learning Through Low-Rank Constraint
Extensive experiments on synthetic data and important computer vision problems, such as face recognition and visual domain adaptation for object recognition, demonstrate the superiority of the proposed approach over existing, well-established methods.
DASH-N: Joint Hierarchical Domain Adaptation and Feature Learning
This work proposes a novel framework for domain adaptation using a sparse and hierarchical network (DASH-N), which jointly learns a hierarchy of features together with transformations that rectify the mismatch between different domains.
Low-Rank Transfer Subspace Learning
This paper proposes a novel framework to solve the knowledge transfer problem via low-rank representation constraints, by finding an optimal subspace where each datum in the target domain can be linearly represented by the corresponding subspace in the source domain.
Adaptation Regularization: A General Framework for Transfer Learning
A novel transfer learning framework, referred to as Adaptation Regularization based Transfer Learning (ARTL), to model adaptive classifiers in a unified way based on the structural risk minimization principle and regularization theory; it can significantly outperform state-of-the-art learning methods on several public text and image datasets.
Transfer Sparse Coding for Robust Image Representation
This paper aims to minimize the distribution divergence between labeled and unlabeled images, and incorporates this criterion into the objective function of sparse coding to make the new representations robust to the distribution difference.
Marginalized Denoising Auto-encoders for Nonlinear Representations
The marginalized Denoising Auto-encoder (mDAE) is presented, which (approximately) marginalizes out the corruption during training and is able to match or outperform the DAE with far fewer training epochs.
Transfer Learning with Graph Co-Regularization
This paper proposes a general framework, referred to as Graph Co-Regularized Transfer Learning (GTL), into which various matrix factorization models can be incorporated, and proposes two novel methods using NMF and NMTF, respectively.