# A Unified Framework for Quadratic Measures of Independence

```bibtex
@article{Seth2011AUF,
  title   = {A Unified Framework for Quadratic Measures of Independence},
  author  = {Sohan Seth and Murali Rao and Il Memming Park and Jos{\'e} Carlos Pr{\'i}ncipe},
  journal = {IEEE Transactions on Signal Processing},
  year    = {2011},
  volume  = {59},
  pages   = {3624--3635}
}
```

This paper proposes a unified framework for several available measures of independence by generalizing the concept of information-theoretic learning (ITL). The key component of ITL is the use of the inner product between two density functions as a measure of similarity between two random variables. We show that by generalizing this inner product using a symmetric, strictly positive-definite kernel, and by choosing appropriate kernels, it is possible to reproduce a number of popular measures of…
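The quadratic, kernel-based construction the abstract describes can be sketched numerically. The following is a minimal illustration, not the paper's exact estimator: it plugs an assumed Gaussian kernel into a plug-in estimate of the squared distance between the joint embedding and the product of the marginal embeddings; the function names and the bandwidth `sigma` are hypothetical choices for this sketch.

```python
import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    # Pairwise Gaussian kernel matrix between 1-D sample vectors a and b.
    d = a[:, None] - b[None, :]
    return np.exp(-d ** 2 / (2.0 * sigma ** 2))

def quadratic_independence(x, y, sigma=1.0):
    """Plug-in estimate of the squared RKHS distance between the joint
    embedding and the product of the marginal embeddings; in the
    population it vanishes under independence."""
    n = x.size
    Kx = gaussian_kernel(x, x, sigma)
    Ky = gaussian_kernel(y, y, sigma)
    t1 = np.sum(Kx * Ky) / n ** 2       # <joint, joint>
    t2 = np.sum(Kx @ Ky) / n ** 3       # <joint, product of marginals>
    t3 = Kx.sum() * Ky.sum() / n ** 4   # <product, product>
    return t1 - 2.0 * t2 + t3
```

Because the statistic is a squared norm of an empirical quantity, it is nonnegative and is small (order 1/n) for independent samples, while dependent samples push it visibly above zero.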

## 22 Citations

Measures of Entropy From Data Using Infinitely Divisible Kernels

- Computer Science · IEEE Transactions on Information Theory
- 2015

A framework to nonparametrically obtain measures of entropy directly from data using operators in reproducing kernel Hilbert spaces defined by infinitely divisible kernels is presented and estimators of kernel-based conditional entropy and mutual information are also defined.
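A minimal sketch of the matrix-based, Rényi-type entropy functional this line of work studies, assuming a trace-normalized positive semidefinite Gram matrix and an order-`alpha` eigenvalue sum; the exact definitions and the role of infinite divisibility are in the cited paper.

```python
import numpy as np

def matrix_renyi_entropy(K, alpha=2.0):
    """Entropy-like functional on a positive semidefinite Gram matrix K:
    normalize K to unit trace, then apply Renyi's form to its
    eigenvalue spectrum (result in bits)."""
    A = K / np.trace(K)              # unit-trace normalization
    lam = np.linalg.eigvalsh(A)      # eigenvalues act like probabilities
    lam = lam[lam > 1e-12]           # drop numerical zeros
    return np.log2(np.sum(lam ** alpha)) / (1.0 - alpha)
```

For K = I_n the spectrum is uniform and the functional equals log2(n), mirroring the maximum-entropy case of a discrete uniform distribution; a rank-one matrix gives zero entropy.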

Reproducing kernel hilbert space methods for information theoretic learning

- Computer Science
- 2012

This work develops a framework for information-theoretic learning based on infinitely divisible matrices, introduces an entropy-like functional on positive-definite matrices following Renyi's definition, and examines key properties of this functional that lead to the concept of infinite divisibility.

A novel formulation of Independence Detection based on the Sample Characteristic Function

- Computer Science · 2018 26th European Signal Processing Conference (EUSIPCO)
- 2018

A novel independence test for continuous random sequences is proposed in this paper. The test is based on seeking coherence in a particular fixed-dimension feature space based on a uniform…

Regularized Estimation of Information via High Dimensional Canonical Correlation Analysis

- Computer Science · ArXiv
- 2020

The squared-loss mutual information is chosen for that purpose as a natural surrogate of Shannon mutual information; this allows resorting to Szegő's theorem to reduce the complexity of high-dimensional mappings, and it exhibits strong dualities with spectral analysis.

Information Theoretic Learning with Infinitely Divisible Kernels

- Computer Science, Mathematics · ICLR
- 2013

An entropy-like functional on positive definite matrices based on Renyi's axiomatic definition of entropy is formulated and some key properties of this functional that lead to the concept of infinite divisibility are examined.

Universal Dependency Analysis

- Computer Science · SDM
- 2016

This paper defines UDS based on cumulative entropy, proposes a principled normalization scheme to bring its scores across different subspaces into the same domain, enabling universal correlation assessment, and introduces an efficient, non-parametric method to compute it.

A parameter-free kernel design based on cumulative distribution function for correntropy

- Computer Science · The 2013 International Joint Conference on Neural Networks (IJCNN)
- 2013

This paper proposes a parameter-free kernel that is translation invariant and positive definite, and uses it to define the autocorrentropy function, a generalized similarity measure, and a spectral density estimator.

Descent Algorithms on Oblique Manifold for Source-Adaptive ICA Contrast

- Computer Science · IEEE Transactions on Neural Networks and Learning Systems
- 2012

A Riemannian manifold optimization strategy is proposed to facilitate the relaxation of the orthonormality constraint in a more natural way in the course of performing independent component analysis…

A Near-Linear Time Subspace Search Scheme for Unsupervised Selection of Correlated Features

- Computer Science, Mathematics · Big Data Res.
- 2014

A Derivative-Free Riemannian Powell’s Method, Minimizing Hartley-Entropy-Based ICA Contrast

- Mathematics · IEEE Transactions on Neural Networks and Learning Systems
- 2016

Powell's derivative-free optimization method has been extended to a Riemannian manifold for the recovery of quasi-correlated sources by minimizing the Hartley-entropy-based contrast function, and it has been demonstrated to converge faster than related algorithms in the literature.

## References

Showing 1-10 of 29 references

Kernel Methods for Measuring Independence

- Computer Science · J. Mach. Learn. Res.
- 2005

Two new functionals, the constrained covariance and the kernel mutual information, are introduced to measure the degree of independence of random variables and it is proved that when the RKHSs are universal, both functionals are zero if and only if the random variables are pairwise independent.
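One common empirical form of the constrained covariance (COCO) described above can be sketched as follows, assuming Gaussian kernels and the usual centering matrix; this is an illustrative sketch, not the authors' reference implementation.

```python
import numpy as np

def gaussian_kernel(a, sigma=1.0):
    # Pairwise Gaussian Gram matrix of a 1-D sample vector a.
    d = a[:, None] - a[None, :]
    return np.exp(-d ** 2 / (2.0 * sigma ** 2))

def coco(x, y, sigma=1.0):
    """Empirical constrained covariance: (1/n) times the square root of
    the largest eigenvalue of the product of the centred Gram matrices."""
    n = x.size
    H = np.eye(n) - np.ones((n, n)) / n         # centering matrix
    Kx = H @ gaussian_kernel(x, sigma) @ H
    Ky = H @ gaussian_kernel(y, sigma) @ H
    lam = np.max(np.real(np.linalg.eigvals(Kx @ Ky)))
    return np.sqrt(max(lam, 0.0)) / n
```

Dependent pairs inflate the largest eigenvalue of the product of centred Gram matrices, so the statistic separates dependent from independent samples.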

A Reproducing Kernel Hilbert Space Framework for Information-Theoretic Learning

- Computer Science · IEEE Transactions on Signal Processing
- 2008

All the statistical descriptors in the original information-theoretic learning formulation can be rewritten as algebraic computations on deterministic functional vectors in the ITL RKHS, instead of limiting the functional view to the estimators as is commonly done in kernel methods.

Nonparametric Tests for Serial Independence Based on Quadratic Forms

- Mathematics, Computer Science
- 2005

The aim of this paper is to introduce tests for serial independence using kernel-based quadratic forms. This separates the problem of consistently estimating the divergence measure from that of consistently estimating the underlying joint densities, whose existence is no longer required.

A test of independence based on a generalized correlation function

- Computer Science · Signal Process.
- 2011

Kernel Choice and Classifiability for RKHS Embeddings of Probability Distributions

- Computer Science · NIPS
- 2009

It is established that MMD corresponds to the optimal risk of a kernel classifier, thus forming a natural link between the distance between distributions and their ease of classification, and a generalization of the MMD is proposed for families of kernels.
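The MMD statistic discussed above has a simple biased (V-statistic) sample estimate. A sketch with an assumed Gaussian kernel and a hypothetical bandwidth `sigma`:

```python
import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    # Pairwise Gaussian kernel matrix between 1-D samples a and b.
    d = a[:, None] - b[None, :]
    return np.exp(-d ** 2 / (2.0 * sigma ** 2))

def mmd2_biased(x, y, sigma=1.0):
    """Biased (V-statistic) estimate of squared MMD between samples x
    and y: E k(x,x') + E k(y,y') - 2 E k(x,y) under the empirical laws."""
    kxx = gaussian_kernel(x, x, sigma).mean()
    kyy = gaussian_kernel(y, y, sigma).mean()
    kxy = gaussian_kernel(x, y, sigma).mean()
    return kxx + kyy - 2.0 * kxy
```

Samples from the same distribution give a value near zero, while a mean shift between the two samples drives the estimate well above it.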

Fast and robust fixed-point algorithms for independent component analysis

- Computer Science, Mathematics · IEEE Trans. Neural Networks
- 1999

Using maximum-entropy approximations of differential entropy, a family of new contrast (objective) functions for ICA is derived, enabling both estimation of the whole decomposition by minimizing mutual information and estimation of individual independent components as projection-pursuit directions.
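The fixed-point iteration behind this family of contrasts can be sketched for a single unit, assuming whitened data and the standard logcosh (tanh) nonlinearity; this is an illustrative one-unit version, not the full FastICA algorithm.

```python
import numpy as np

def fastica_one_unit(X, n_iter=100, tol=1e-8, seed=0):
    """One-unit FastICA fixed-point iteration with the logcosh
    nonlinearity (g = tanh) on whitened data X of shape (d, n).
    Returns a unit vector w; w @ X estimates one source."""
    rng = np.random.default_rng(seed)
    d, n = X.shape
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        wx = w @ X                        # current projection
        g = np.tanh(wx)                   # nonlinearity
        g_prime = 1.0 - g ** 2            # its derivative
        # fixed-point update: w <- E[x g(w'x)] - E[g'(w'x)] w
        w_new = (X * g).mean(axis=1) - g_prime.mean() * w
        w_new /= np.linalg.norm(w_new)
        if abs(abs(w_new @ w) - 1.0) < tol:   # converged up to sign
            return w_new
        w = w_new
    return w
```

On a rotation of two whitened non-Gaussian sources, the recovered projection should correlate strongly with one of the original sources (up to sign and permutation).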

Information Theoretic Learning - Renyi's Entropy and Kernel Perspectives

- Computer Science · Information Theoretic Learning
- 2010

In this book, students, practitioners, and researchers interested in statistical signal processing, computational intelligence, and machine learning will find the theory to understand the basics, the algorithms to implement applications, and exciting but still unexplored leads that provide fertile ground for future research.

ICA Using Spacings Estimates of Entropy

- Computer Science · J. Mach. Learn. Res.
- 2003

A new algorithm for the independent component analysis (ICA) problem is presented, based on an efficient entropy estimator; the method is simple, computationally efficient, intuitively appealing, and outperforms other well-known algorithms.
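The entropy estimator in question is built from order statistics. A minimal m-spacings (Vasicek-style) sketch, with the choice m ≈ √n taken as an assumption of this illustration:

```python
import numpy as np

def spacings_entropy(x, m=None):
    """m-spacings estimate of differential entropy: sort the sample,
    then average log((n + 1)/m * (x_(i+m) - x_(i))) over the spacings.
    m defaults to sqrt(n)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    if m is None:
        m = int(round(np.sqrt(n)))
    spacings = x[m:] - x[:-m]            # x_(i+m) - x_(i)
    return np.mean(np.log((n + 1) / m * spacings))
```

As a sanity check, the estimate should approach 0.5·log(2πe) ≈ 1.419 nats for a standard normal sample and 0 nats for a Uniform(0, 1) sample.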

Kernel independent component analysis

- Computer Science · 2003 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '03)
- 2003

A class of algorithms for independent component analysis which use contrast functions based on canonical correlations in a reproducing kernel Hilbert space is presented, showing that these algorithms outperform many of the presently known algorithms.