Characteristic and Universal Tensor Product Kernels

@article{Szab2017CharacteristicAU,
  title={Characteristic and Universal Tensor Product Kernels},
  author={Zolt{\'a}n Szab{\'o} and Bharath K. Sriperumbudur},
  journal={ArXiv},
  year={2017},
  volume={abs/1708.08157}
}
Maximum mean discrepancy (MMD), known in statistics as the energy distance or N-distance, and the Hilbert-Schmidt independence criterion (HSIC), known in statistics as distance covariance, are among the most popular and successful approaches to quantifying the difference between random variables and their independence, respectively. Thanks to their kernel-based foundations, MMD and HSIC are applicable to a wide variety of domains. Despite their tremendous success, quite little is known about when HSIC…
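For concreteness, here is a minimal NumPy sketch of the standard biased (V-statistic) estimators of squared MMD and of HSIC with Gaussian kernels; the bandwidth sigma, the sample sizes, and all function names are illustrative choices, not anything prescribed by the paper.

import numpy as np

def gaussian_gram(X, Y, sigma=1.0):
    """Gram matrix with entries exp(-||x_i - y_j||^2 / (2 sigma^2))."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-sq / (2.0 * sigma**2))

def mmd2_biased(X, Y, sigma=1.0):
    """Biased (V-statistic) estimate of squared MMD between samples X and Y."""
    return (gaussian_gram(X, X, sigma).mean()
            + gaussian_gram(Y, Y, sigma).mean()
            - 2.0 * gaussian_gram(X, Y, sigma).mean())

def hsic_biased(X, Y, sigma=1.0):
    """Biased estimate of HSIC for paired samples (x_i, y_i):
    tr(K H L H) / n^2, with H the centering matrix."""
    n = X.shape[0]
    K = gaussian_gram(X, X, sigma)
    L = gaussian_gram(Y, Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / n**2

On dependent pairs, e.g. Y = X + noise, hsic_biased returns a markedly larger value than on independently resampled Y; likewise mmd2_biased is close to zero only when the two samples come from the same distribution.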

Citations

A Measure-Theoretic Approach to Kernel Conditional Mean Embeddings

A new operator-free, measure-theoretic definition of the conditional mean embedding as a random variable taking values in a reproducing kernel Hilbert space is presented, and a thorough analysis of its properties, including universal consistency, is provided.

On Distance and Kernel Measures of Conditional Independence

The distance and kernel measures of conditional independence are shown not to be equivalent, unlike in the case of joint independence treated by Sejdinovic et al. (2013).

Self-Supervised Learning with Kernel Dependence Maximization

Self-Supervised Learning with the Hilbert-Schmidt Independence Criterion (SSL-HSIC) is proposed, which maximizes dependence between representations of transformations of an image and the image identity, while minimizing the kernelized variance of those representations.
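Schematically, and hedging on the paper's exact regularization, the objective pairs those two terms as

$$\mathcal{L}_{\text{SSL-HSIC}} = -\,\mathrm{HSIC}(Z, Y) + \gamma\,\sqrt{\mathrm{HSIC}(Z, Z)},$$

where $Z$ are the representations, $Y$ the image identities, and $\gamma$ a trade-off weight.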

Learning Inconsistent Preferences with Kernel Methods

A probabilistic kernel approach to preferential learning from pairwise duelling data using Gaussian processes is proposed; it can capture more expressive latent preferential structures such as inconsistent preferences and clusters of comparable items.

Measuring Association on Topological Spaces Using Kernels and Geometric Graphs

In this paper we propose and study a class of simple, nonparametric, yet interpretable measures of association between two random variables $X$ and $Y$ taking values in general topological spaces.

Differentially Private Mean Embeddings with Random Features (DP-MERF) for Simple & Practical Synthetic Data Generation

This work presents a differentially private data generation paradigm that uses random feature representations of kernel mean embeddings to compare the distribution of the true data with that of the synthetic data; it achieves better privacy-utility trade-offs than existing methods on several datasets.
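As a rough sketch of this paradigm (the helper names and the Gaussian-mechanism noise calibration below are standard choices assumed for illustration, not the paper's exact recipe): compute the random-feature mean embedding of the private data once, release it with Gaussian noise, and train a generator against the noisy embedding only.

import numpy as np

def rff(X, W, b):
    """Random Fourier features for a Gaussian kernel; each feature
    vector has Euclidean norm at most sqrt(2)."""
    D = W.shape[1]
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

def private_mean_embedding(X, W, b, epsilon, delta, rng):
    """Release the random-feature mean embedding of the private data X
    via the Gaussian mechanism; the L2 sensitivity of the mean is
    2*sqrt(2)/n for replace-one neighbouring datasets."""
    n = X.shape[0]
    mu = rff(X, W, b).mean(axis=0)
    sensitivity = 2.0 * np.sqrt(2.0) / n
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return mu + rng.normal(scale=sigma, size=mu.shape)

With W drawn i.i.d. from N(0, 1/bandwidth^2) and b uniform on [0, 2*pi], the features approximate a Gaussian kernel; a generator is then fitted so that the embedding of its synthetic samples matches this fixed noisy vector, so the private data are touched only once and privacy follows by post-processing.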

Consistency of permutation tests of independence using distance covariance, HSIC and dHSIC

This work provides a simple proof that the permutation test with the test statistic HSIC or dHSIC is indeed consistent when using characteristic kernels, and proves that under each alternative hypothesis the power of these permutation tests converges to 1 as the sample size tends to infinity.
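A minimal sketch of such a permutation test, with the estimator passed in as a callable (for example the hsic_biased sketch above); the number of permutations and the level are illustrative:

import numpy as np

def hsic_permutation_test(X, Y, hsic, num_perms=500, alpha=0.05, seed=0):
    """Reject independence iff the observed statistic is large relative
    to its permutation null, obtained by shuffling Y to break the
    pairing with X."""
    rng = np.random.default_rng(seed)
    observed = hsic(X, Y)
    null = np.array([hsic(X, Y[rng.permutation(len(Y))])
                     for _ in range(num_perms)])
    p_value = (1 + np.sum(null >= observed)) / (1 + num_perms)
    return p_value, p_value <= alpha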

Gaussian Processes and Kernel Methods: A Review on Connections and Equivalences

This paper is an attempt to bridge the conceptual gaps between researchers working on the two widely used approaches based on positive definite kernels: Bayesian learning or inference using Gaussian processes, and frequentist kernel methods based on reproducing kernel Hilbert spaces.

Estimating Rényi's α-Cross-Entropies in a Matrix-Based Way

This work considers function-based formulations of cross-entropy that sidestep the a priori density estimation requirement, and proposes three measures of Rényi's α-cross-entropies in the setting of reproducing-kernel Hilbert spaces, proving that these measures can be estimated in an unbiased, non-parametric, and minimax-optimal way.
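For orientation, such matrix-based functionals are built from normalized Gram matrices; the sketch below shows the earlier matrix-based Rényi α-entropy of Sánchez Giraldo et al., which illustrates the recipe (the paper's cross-entropy measures have their own definitions, not reproduced here):

import numpy as np

def matrix_renyi_entropy(X, alpha=2.0, sigma=1.0):
    """Matrix-based Renyi alpha-entropy: log2(tr(A^alpha)) / (1 - alpha),
    where A is the kernel Gram matrix normalized to unit trace."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2.0 * X @ X.T
    K = np.exp(-sq / (2.0 * sigma**2))
    A = K / np.trace(K)                  # unit-trace normalization
    eigvals = np.linalg.eigvalsh(A)
    eigvals = eigvals[eigvals > 1e-12]   # guard against round-off
    return np.log2(np.sum(eigvals**alpha)) / (1.0 - alpha)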

Discussion of 'Multiscale Fisher's Independence Test for Multivariate Dependence'

We discuss how MultiFIT, the Multiscale Fisher's Independence Test for Multivariate Dependence proposed by Gorsky and Ma (2022), compares to existing linear-time kernel tests based on the Hilbert-Schmidt independence criterion (HSIC).

References


On the relation between universality, characteristic kernels and RKHS embedding of measures

The main contribution of this paper is to clarify the relation between universal and characteristic kernels by presenting a unifying study relating them to the RKHS embedding of measures, in addition to clarifying their relation to other common notions of strictly positive definite, conditionally strictly positive definite and integrally strictly positive definite kernels.

Strictly proper kernel scores and characteristic kernels on compact spaces

Equivalence of distance-based and RKHS-based statistics in hypothesis testing

It is shown that the energy distance most commonly employed in statistics is just one member of a parametric family of kernels, and that other choices from this family can yield more powerful tests.
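In symbols, the family in question arises from the semimetrics $\rho_\beta(x,y) = \|x-y\|^\beta$ via distance-induced kernels:

$$k_\beta(x,y) = \tfrac{1}{2}\big(\|x\|^\beta + \|y\|^\beta - \|x-y\|^\beta\big), \qquad 0 < \beta \le 2,$$

and the energy distance of exponent $\beta$ equals $2\gamma_{k_\beta}^2$; $\beta = 1$ recovers the classical energy distance, while other choices of $\beta$ can give more powerful tests.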

Measuring Statistical Dependence with Hilbert-Schmidt Norms

We propose an independence criterion based on the eigenspectrum of covariance operators in reproducing kernel Hilbert spaces (RKHSs), consisting of an empirical estimate of the Hilbert-Schmidt norm of the cross-covariance operator.
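The empirical estimate in question has a simple trace form: with Gram matrices $K_{ij} = k(x_i, x_j)$ and $L_{ij} = l(y_i, y_j)$,

$$\widehat{\mathrm{HSIC}} = \frac{1}{(n-1)^2}\,\operatorname{tr}(KHLH), \qquad H = I - \tfrac{1}{n}\mathbf{1}\mathbf{1}^\top,$$

which differs from the $1/n^2$-normalized variant sketched earlier only in the normalizing constant.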

Hilbert Space Embeddings and Metrics on Probability Measures

It is shown that the distance between distributions under $\gamma_k$ results from an interplay between the properties of the kernel and the distributions, by demonstrating that distributions are close in the embedding space when their differences occur at higher frequencies.
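Here $\gamma_k$ denotes the kernel metric (MMD) induced by embedding distributions into the RKHS $\mathcal{H}_k$:

$$\gamma_k(P,Q) = \big\| \mathbb{E}_{X\sim P}\,k(\cdot,X) - \mathbb{E}_{Y\sim Q}\,k(\cdot,Y) \big\|_{\mathcal{H}_k}, \qquad \gamma_k^2(P,Q) = \mathbb{E}\,k(X,X') + \mathbb{E}\,k(Y,Y') - 2\,\mathbb{E}\,k(X,Y).$$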

Kernel Mean Embedding of Distributions: A Review and Beyond

A comprehensive review of existing work and recent advances in the Hilbert space embedding of distributions is provided, and the most challenging issues and open problems that could lead to new research directions are discussed.

Kernel Distribution Embeddings: Universal Kernels, Characteristic Kernels and Kernel Metrics on Distributions

This work systematises and extends various (partly known) equivalences between different notions of universal, characteristic and strictly positive definite kernels, and shows that on an underlying locally compact Hausdorff space, $d_k$ metrises the weak convergence of probability measures if and only if $k$ is continuous and characteristic.

Kernel Methods for Measuring Independence

Two new functionals, the constrained covariance and the kernel mutual information, are introduced to measure the degree of independence of random variables and it is proved that when the RKHSs are universal, both functionals are zero if and only if the random variables are pairwise independent.

Kernel Measures of Conditional Dependence

A new measure of conditional dependence of random variables is proposed, based on normalized cross-covariance operators on reproducing kernel Hilbert spaces; it has a straightforward empirical estimate with good convergence behaviour.

Kernel dimension reduction in regression

We present a new methodology for sufficient dimension reduction (SDR). Our methodology derives directly from the formulation of SDR in terms of the conditional independence of the covariate X from the response Y, given the projection of X onto the central subspace.
...