# Quantum subspace alignment for domain adaptation

```bibtex
@article{He2020QuantumSA,
  title   = {Quantum subspace alignment for domain adaptation},
  author  = {Xi He and Xiaoting Wang},
  journal = {ArXiv},
  year    = {2020},
  volume  = {abs/2001.02472}
}
```
• Published 8 January 2020
• Computer Science
• ArXiv
Domain adaptation (DA) is used to infer labels for an unlabelled data set from a given related, but differently distributed, labelled data set. Subspace alignment (SA), a representative DA algorithm, seeks a linear transformation that aligns the subspaces of the two data sets. A classifier trained on the aligned labelled data set can then be transferred to the unlabelled data set to predict the target labels. In this paper, two quantum versions of SA are proposed to…
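The classical SA procedure the abstract describes can be sketched as follows. This is the classical baseline (Fernando et al., 2013) that the paper's quantum versions build on, not the quantum algorithm itself; the toy data and variable names are illustrative:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
Xs = rng.normal(size=(100, 10))               # labelled source samples (toy data)
ys = (Xs[:, 0] > 0).astype(int)               # toy source labels
Xt = Xs + 0.5 * rng.normal(size=(100, 10))    # shifted, unlabelled target samples

d = 5
S = PCA(n_components=d).fit(Xs).components_.T  # source subspace basis (10 x d)
T = PCA(n_components=d).fit(Xt).components_.T  # target subspace basis (10 x d)

M = S.T @ T                  # linear map aligning the source basis to the target basis
Xs_aligned = Xs @ S @ M      # source data projected into the aligned subspace
Xt_proj = Xt @ T             # target data in its own subspace

# Classifier trained on aligned source data transfers to the target set
clf = KNeighborsClassifier(n_neighbors=3).fit(Xs_aligned, ys)
yt_pred = clf.predict(Xt_proj)
```

The key step is the alignment matrix `M = S.T @ T`, which maps coordinates in the source PCA subspace onto the target PCA subspace before classification.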
2 Citations

## Citations

### Learning Equality Constraints for Motion Planning on Manifolds

• Computer Science
CoRL
• 2020
This work considers the problem of learning representations of constraints from demonstrations with a deep neural network, which it calls Equality Constraint Manifold Neural Network (ECoMaNN), to learn a level-set function of the constraint suitable for integration into a constrained sampling-based motion planner.

## References

Showing 1-10 of 87 references

### Quantum correlation alignment for unsupervised domain adaptation

• Xi He
• Computer Science
ArXiv
• 2020
The simulation results prove that the variational quantum correlation alignment algorithm (VQCORAL) can achieve competitive performance compared with the classical CORAL.
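For context, the classical CORAL baseline that VQCORAL is compared against whitens the source covariance and re-colors it with the target covariance. A minimal sketch, assuming toy data (the names `Xs`, `Xt` are illustrative):

```python
import numpy as np
from scipy.linalg import fractional_matrix_power

rng = np.random.default_rng(1)
Xs = rng.normal(size=(200, 8))                             # source features (toy)
Xt = rng.normal(size=(200, 8)) @ rng.normal(size=(8, 8))   # target features (toy)

Cs = np.cov(Xs, rowvar=False) + np.eye(8)   # regularised source covariance
Ct = np.cov(Xt, rowvar=False) + np.eye(8)   # regularised target covariance

# Whiten with Cs^{-1/2}, then re-color with Ct^{1/2}
Xs_coral = Xs @ fractional_matrix_power(Cs, -0.5) @ fractional_matrix_power(Ct, 0.5)
Xs_coral = np.real(Xs_coral)   # guard against tiny imaginary round-off
```

A classifier is then trained on `Xs_coral` and applied directly to `Xt`, since their second-order statistics now approximately match.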

### Subspace Distribution Alignment for Unsupervised Domain Adaptation

• Computer Science
BMVC
• 2015
A unified view of existing subspace-mapping-based methods is presented, and a generalized approach that aligns the distributions as well as the subspace bases is developed, showing improved results over published approaches.

### Unsupervised domain adaptation using parallel transport on Grassmann manifold

• Computer Science
IEEE Winter Conference on Applications of Computer Vision
• 2014
A novel framework based on the parallel transport of union of the source subspaces on the Grassmann manifold is developed, which allows for multiple domain shifts between the source and target domains.

### Domain adaptation for object recognition: An unsupervised approach

• Computer Science
2011 International Conference on Computer Vision
• 2011
This paper presents one of the first studies on unsupervised domain adaptation in the context of object recognition, where data are labeled only in the source domain (and therefore no correspondences between object categories exist across domains).

### Unsupervised Visual Domain Adaptation Using Subspace Alignment

• Computer Science
2013 IEEE International Conference on Computer Vision
• 2013
This paper introduces a new domain adaptation algorithm where the source and target domains are represented by subspaces described by eigenvectors, and seeks a domain adaptation solution by learning a mapping function which aligns the source subspace with the target one.

### Subspace Interpolation via Dictionary Learning for Unsupervised Domain Adaptation

• Computer Science
2013 IEEE Conference on Computer Vision and Pattern Recognition
• 2013
This work proposes to interpolate subspaces through dictionary learning to link the source and target domains, which are able to capture the intrinsic domain shift and form a shared feature representation for cross domain recognition.

### Quantum locally linear embedding

• Computer Science
• 2019
This paper presents two implementations of the quantum locally linear embedding algorithm (qLLE) to perform nonlinear dimensionality reduction on quantum devices, achieving a runtime of $O(\mathrm{poly}(\log N))$, an exponential speedup over the classical algorithm.
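The classical counterpart that qLLE accelerates can be run in a few lines with scikit-learn; this sketch (toy 3-D data lying near a 2-D sheet) is only the classical baseline, not the quantum algorithm:

```python
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding

rng = np.random.default_rng(2)
# Toy swiss-roll-like data: 3-D points that actually live on a 2-D sheet
t = rng.uniform(0, 3 * np.pi, size=300)
X = np.column_stack([t * np.cos(t), rng.uniform(size=300), t * np.sin(t)])

# Classical LLE: reconstruct each point from its neighbours, then find a
# low-dimensional embedding that preserves those reconstruction weights
lle = LocallyLinearEmbedding(n_neighbors=10, n_components=2)
Y = lle.fit_transform(X)   # 2-D embedding of the 3-D data
```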

### Connecting the Dots with Landmarks: Discriminatively Learning Domain-Invariant Features for Unsupervised Domain Adaptation

• Computer Science
ICML
• 2013
This paper automatically discovers the existence of landmarks and uses them to bridge the source to the target by constructing provably easier auxiliary domain adaptation tasks, and shows how this composition can be optimized discriminatively without requiring labels from the target domain.

### Geodesic flow kernel for unsupervised domain adaptation

• Computer Science
2012 IEEE Conference on Computer Vision and Pattern Recognition
• 2012
This paper proposes a new kernel-based method that takes advantage of low-dimensional structures that are intrinsic to many vision datasets, and introduces a metric that reliably measures the adaptability between a pair of source and target domains.

### Quantum variational autoencoder

• Computer Science
Quantum Science and Technology
• 2018
A quantum variational autoencoder (QVAE) is introduced: a VAE whose latent generative process is implemented as a quantum Boltzmann machine (QBM), which can be trained end-to-end by maximizing a well-defined loss-function: a ‘quantum’ lower-bound to a variational approximation of the log-likelihood.