Corpus ID: 218869548

Feature Robust Optimal Transport for High-dimensional Data

@article{Petrovich2020FeatureRO,
  title={Feature Robust Optimal Transport for High-dimensional Data},
  author={Mathis Petrovich and Chao Liang and Yanbin Liu and Yao-Hung Hubert Tsai and Linchao Zhu and Yi Yang and Ruslan Salakhutdinov and Makoto Yamada},
  journal={ArXiv},
  year={2020},
  volume={abs/2005.12123}
}
Optimal transport is a machine learning problem with applications including distribution comparison, feature selection, and generative adversarial networks. In this paper, we propose feature robust optimal transport (FROT) for high-dimensional data, which jointly solves feature selection and OT problems. Specifically, we formulate the FROT problem as a min-max optimization problem. Then, we propose a convex formulation of FROT and solve it with the Frank-Wolfe-based optimization algorithm… 
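
As a reading aid (the notation below is illustrative and not taken from this page), the min-max structure described in the abstract can be sketched as follows: if the features are split into $L$ groups with per-group cost matrices $C^{(l)}$ and a group-weight vector $\alpha$ on the simplex $\Delta_L$, then

$\mathrm{FROT}(\mu, \nu) = \min_{\Pi \in U(a, b)} \max_{\alpha \in \Delta_L} \sum_{l=1}^{L} \alpha_l \langle \Pi, C^{(l)} \rangle,$

where $U(a, b)$ is the set of transport plans with marginals $a$ and $b$. The convex reformulation and the Frank-Wolfe-based solver mentioned in the abstract would then operate on a regularized version of this kind of objective.
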
3 Citations

Re-evaluating Word Mover's Distance

TLDR
An analogy between WMD and L1-normalized BOW is introduced, and it is shown that not only the performance of WMD but also its distance values resemble those of BOW in high-dimensional spaces.
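
A minimal sketch of the BOW baseline referenced in this TLDR, assuming raw word-count vectors (the vocabulary and counts below are illustrative):

```python
import numpy as np

def l1_normalized_bow_distance(counts_a, counts_b):
    """L1 distance between L1-normalized bag-of-words vectors.

    This is the simple baseline that, per the TLDR above, behaves much
    like WMD in high-dimensional (large-vocabulary) settings.
    """
    a = np.asarray(counts_a, dtype=float)
    b = np.asarray(counts_b, dtype=float)
    a /= a.sum()
    b /= b.sum()
    return np.abs(a - b).sum()

# Toy example over a 5-word vocabulary (illustrative counts).
print(l1_normalized_bow_distance([2, 0, 1, 0, 1], [0, 1, 1, 2, 0]))
```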

Equitable and Optimal Transport with Multiple Agents

TLDR
This work introduces an extension of the Optimal Transport problem when multiple costs are involved, and provides an entropic regularization of that problem which leads to an alternative algorithm faster than the standard linear program.

Efficient Robust Optimal Transport with Application to Multi-Label Classification

TLDR
This work proposes a novel OT formulation that takes feature correlations into account while learning the transport plan between two distributions, and models the feature-feature relationship via a symmetric positive semi-definite Mahalanobis metric in the OT cost function.
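
As an illustration of the Mahalanobis-type ground cost described above (a sketch, not the paper's implementation; the matrix `M` here is an arbitrary symmetric PSD example):

```python
import numpy as np

def mahalanobis_cost_matrix(X, Y, M):
    """Pairwise ground cost C[i, j] = (x_i - y_j)^T M (x_i - y_j).

    X: (n, d) source samples, Y: (m, d) target samples,
    M: (d, d) symmetric positive semi-definite matrix encoding
    feature-feature relationships.
    """
    diff = X[:, None, :] - Y[None, :, :]          # (n, m, d)
    return np.einsum('nmd,de,nme->nm', diff, M, diff)

# Illustrative example: M built as A^T A, which is PSD by construction.
rng = np.random.default_rng(0)
X, Y = rng.normal(size=(4, 3)), rng.normal(size=(5, 3))
A = rng.normal(size=(3, 3))
C = mahalanobis_cost_matrix(X, Y, A.T @ A)
print(C.shape)  # (4, 5)
```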

References

SHOWING 1-10 OF 60 REFERENCES

Semantic Correspondence as an Optimal Transport Problem

TLDR
This work solves the problem of establishing dense correspondences across semantically similar images by converting the maximization problem into an optimal transport formulation and incorporating staircase weights into the optimal transport algorithm to act as empirical distributions.

SPair-71k: A Large-scale Benchmark for Semantic Correspondence

TLDR
SPair-71k, a new large-scale benchmark dataset of semantically paired images containing 70,958 image pairs with diverse variations in viewpoint and scale, is presented; it is significantly larger than existing datasets and contains more accurate and richer annotations.

Hyperpixel Flow: Semantic Correspondence With Multi-Layer Neural Features

TLDR
The proposed method, hyperpixel flow, sets a new state of the art on three standard benchmarks as well as a new dataset, SPair-71k, which contains a significantly larger number of image pairs than existing datasets, with more accurate and richer annotations for in-depth analysis.

Subspace Robust Wasserstein distances

TLDR
This work proposes a "max-min" robust variant of the Wasserstein distance by considering the maximal possible distance that can be realized between two measures, assuming they can be projected orthogonally onto a lower $k$-dimensional subspace.
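
In symbols (illustrative notation, not quoted from the paper), this max-min quantity can be written as

$P_k(\mu, \nu) = \max_{E :\, \dim E = k} W\big(P_{E\#}\mu,\; P_{E\#}\nu\big),$

where $P_E$ denotes orthogonal projection onto the $k$-dimensional subspace $E$, $P_{E\#}\mu$ is the projected measure, and $W$ is the Wasserstein distance, itself a minimum over couplings, hence "max-min".
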

Sinkhorn Distances: Lightspeed Computation of Optimal Transport

TLDR
This work smooths the classic optimal transport problem with an entropic regularization term, and shows that the resulting optimum is also a distance which can be computed through Sinkhorn's matrix scaling algorithm at a speed that is several orders of magnitude faster than that of transport solvers.
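
A minimal sketch of the Sinkhorn matrix-scaling iteration described above, assuming a cost matrix `C`, marginals `a` and `b`, and regularization strength `reg` (all illustrative):

```python
import numpy as np

def sinkhorn(a, b, C, reg=0.1, n_iter=200):
    """Entropic-regularized OT via Sinkhorn's matrix scaling.

    a: (n,) source weights, b: (m,) target weights (each summing to 1),
    C: (n, m) ground-cost matrix, reg: entropic regularization strength.
    Returns the (n, m) transport plan.
    """
    K = np.exp(-C / reg)                 # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)                # scale columns to match b
        u = a / (K @ v)                  # scale rows to match a
    return u[:, None] * K * v[None, :]

# Toy example with uniform marginals and a random cost (illustrative).
rng = np.random.default_rng(0)
C = rng.random((4, 5))
a, b = np.full(4, 1/4), np.full(5, 1/5)
P = sinkhorn(a, b, C)
print(P.sum(axis=1))  # ≈ a
```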

The Earth Mover's Distance as a Metric for Image Retrieval

TLDR
This paper investigates the properties of a metric between two distributions, the Earth Mover's Distance (EMD), for content-based image retrieval, and compares the retrieval performance of the EMD with that of other distances.

A Swiss Army Knife for Minimax Optimal Transport

TLDR
This paper proposes a cutting-set method for solving the minimax OT problem and uses it to define a notion of stability that allows selecting a ground metric robust to bounded perturbations.

A Kernel Statistical Test of Independence

TLDR
A novel test of the independence hypothesis is proposed for one particular kernel independence measure, the Hilbert-Schmidt independence criterion (HSIC); it outperforms established contingency-table and functional-correlation-based tests, and its advantage is greater for multivariate data.
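
A minimal sketch of a (biased) empirical HSIC with Gaussian kernels, assuming samples `X` and `Y` with one row per observation; the bandwidth and normalization here are illustrative, and the actual test additionally requires a threshold from the null distribution:

```python
import numpy as np

def rbf_kernel(Z, sigma=1.0):
    """Gaussian (RBF) kernel matrix for the rows of Z."""
    sq = np.sum(Z**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * Z @ Z.T
    return np.exp(-d2 / (2 * sigma**2))

def hsic_biased(X, Y, sigma=1.0):
    """Biased empirical HSIC: trace(K H L H) / n^2 with centering matrix H."""
    n = X.shape[0]
    K, L = rbf_kernel(X, sigma), rbf_kernel(Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / n**2

# Dependent toy data: Y is a noisy function of X (illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))
Y = X**2 + 0.1 * rng.normal(size=(100, 1))
print(hsic_biased(X, Y))
```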

A Kernel Two-Sample Test

TLDR
This work proposes a framework for analyzing and comparing distributions, which is used to construct statistical tests that determine whether two samples are drawn from different distributions, and presents two distribution-free tests based on large deviation bounds for the maximum mean discrepancy (MMD).
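
Similarly, a minimal sketch of a (biased) squared-MMD estimate with a Gaussian kernel, assuming samples `X` and `Y` (the bandwidth is illustrative; the distribution-free tests in the paper compare such a statistic against a large-deviation threshold):

```python
import numpy as np

def rbf(A, B, sigma=1.0):
    """Gaussian kernel matrix between the rows of A and the rows of B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def mmd2_biased(X, Y, sigma=1.0):
    """Biased estimate of MMD^2 = E[k(x,x')] + E[k(y,y')] - 2 E[k(x,y)]."""
    return (rbf(X, X, sigma).mean()
            + rbf(Y, Y, sigma).mean()
            - 2 * rbf(X, Y, sigma).mean())

# Two toy samples with different means (illustrative).
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 2))
Y = rng.normal(0.5, 1.0, size=(200, 2))
print(mmd2_biased(X, Y))
```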

Fixed Support Tree-Sliced Wasserstein Barycenter

TLDR
Through real-world experiments, it is shown that, with the proposed algorithm, the FS-TWB and FS-TSWB can be computed two orders of magnitude faster than the original Wasserstein barycenter.
...