A Unified Framework for Quadratic Measures of Independence

@article{Seth2011AUF,
  title={A Unified Framework for Quadratic Measures of Independence},
  author={Sohan Seth and Murali Rao and Il Memming Park and Jos{\'e} Carlos Pr{\'i}ncipe},
  journal={IEEE Transactions on Signal Processing},
  year={2011},
  volume={59},
  pages={3624--3635}
}
  • S. Seth, M. Rao, I. M. Park, J. C. Príncipe
  • Published 1 August 2011
  • Computer Science, Mathematics
  • IEEE Transactions on Signal Processing
This paper proposes a unified framework for several available measures of independence by generalizing the concept of information theoretic learning (ITL). The key component of ITL is the use of the inner product between two density functions as a measure of similarity between two random variables. We show that by generalizing this inner product using a symmetric strictly positive-definite kernel, and by choosing appropriate kernels, it is possible to reproduce a number of popular measures of…
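As a concrete illustration of this framework, the sketch below (not the authors' code; the Gaussian kernel, 1-D samples, and all names are illustrative assumptions) computes the Euclidean-distance member of the family: the squared L2 distance between a Parzen estimate of the joint density and the product of the marginal estimates. Since convolving two Gaussian windows of width sigma gives a Gaussian of width sigma*sqrt(2), every integral collapses to pairwise kernel sums.

import numpy as np

def gaussian_gram(z, sigma):
    # Gram matrix G_sigma(z_i - z_j) for a 1-D sample z
    d = z[:, None] - z[None, :]
    return np.exp(-d**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)

def quadratic_independence(x, y, sigma=1.0):
    # Squared L2 distance between Parzen estimates of p_xy and p_x * p_y;
    # the sqrt(2) widening accounts for the Gaussian-Gaussian convolution.
    s = sigma * np.sqrt(2.0)
    Kx = gaussian_gram(np.asarray(x, float), s)
    Ky = gaussian_gram(np.asarray(y, float), s)
    joint = np.mean(Kx * Ky)                            # <p_xy, p_xy>
    cross = np.mean(Kx.mean(axis=1) * Ky.mean(axis=1))  # <p_xy, p_x p_y>
    marg = Kx.mean() * Ky.mean()                        # <p_x p_y, p_x p_y>
    return joint - 2.0 * cross + marg                   # >= 0, ~0 under independence

rng = np.random.default_rng(0)
x = rng.normal(size=500)
print(quadratic_independence(x, rng.normal(size=500)))             # near zero
print(quadratic_independence(x, x + 0.1 * rng.normal(size=500)))   # clearly positive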

Citations

Measures of Entropy From Data Using Infinitely Divisible Kernels
A framework to nonparametrically obtain measures of entropy directly from data using operators in reproducing kernel Hilbert spaces defined by infinitely divisible kernels is presented and estimators of kernel-based conditional entropy and mutual information are also defined.
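For concreteness, a minimal sketch of the matrix-based entropy this line of work defines (the formula S_alpha(A) = log2(tr(A^alpha)) / (1 - alpha) for a unit-trace Gram matrix A follows the published definition; the Gaussian kernel and function name are my illustrative choices):

import numpy as np

def matrix_renyi_entropy(X, alpha=2.0, sigma=1.0):
    # X: (n, d) samples. Gaussian Gram matrix, normalized to unit trace.
    d2 = np.sum((X[:, None, :] - X[None, :, :])**2, axis=-1)
    A = np.exp(-d2 / (2.0 * sigma**2))
    A /= np.trace(A)
    lam = np.linalg.eigvalsh(A)          # A is symmetric PSD
    lam = lam[lam > 1e-12]               # drop numerical zeros
    return np.log2(np.sum(lam**alpha)) / (1.0 - alpha)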
Reproducing kernel Hilbert space methods for information theoretic learning
This work develops a framework for information theoretic learning based on infinitely divisible matrices and introduces an entropy-like functional on positive definite matrices based on Renyi's definition and examines some key properties of this functional that lead to the concept of infinite divisibility.
A novel formulation of Independence Detection based on the Sample Characteristic Function
A novel independence test for continuous random sequences is proposed in this paper. The test is based on seeking coherence in a particular fixed-dimension feature space based on a uniform…
Regularized Estimation of Information via High Dimensional Canonical Correlation Analysis
The squared-loss mutual information is chosen as a natural surrogate of Shannon mutual information, allowing resort to Szegő's theorem to reduce the complexity of high-dimensional mappings and exhibiting strong dualities with spectral analysis.
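For reference, the squared-loss mutual information mentioned here is commonly defined via the Pearson divergence between the joint density and the product of marginals:

\mathrm{SMI}(X;Y) \;=\; \frac{1}{2}\iint p_X(x)\,p_Y(y)\left(\frac{p_{XY}(x,y)}{p_X(x)\,p_Y(y)} - 1\right)^{2} dx\,dy ,

so it is nonnegative and vanishes exactly when p_{XY} = p_X p_Y.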
Information Theoretic Learning with Infinitely Divisible Kernels
An entropy-like functional on positive definite matrices based on Renyi's axiomatic definition of entropy is formulated and some key properties of this functional that lead to the concept of infinite divisibility are examined.
Universal Dependency Analysis
This paper defines UDS based on cumulative entropy and proposes a principled normalization scheme to bring its scores across different subspaces to the same domain, enabling universal correlation assessment; it also introduces an efficient, non-parametric method to compute it.
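Cumulative entropy, on which this measure builds, replaces the density in Shannon's definition with the distribution function; in standard form (my rendering):

\mathcal{CE}(X) \;=\; -\int P(X \le x)\,\log P(X \le x)\,dx .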
A parameter-free kernel design based on cumulative distribution function for correntropy
This paper proposes a parameter-free, translation-invariant, positive-definite kernel, which is used to define the autocorrentropy function, a generalized similarity measure, and the corresponding spectral density estimator.
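For context, the autocorrentropy of a process \{X_t\} under a translation-invariant positive-definite kernel \kappa is

V(s,t) \;=\; \mathbb{E}\big[\kappa(X_s - X_t)\big] ,

and the contribution here is a \kappa derived from the cumulative distribution function of the data, which removes the kernel-size parameter.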
Descent Algorithms on Oblique Manifold for Source-Adaptive ICA Contrast
A Riemannian manifold optimization strategy is proposed to facilitate the relaxation of the orthonormality constraint in a more natural way in the course of performing independent component analysis.
A Near-Linear Time Subspace Search Scheme for Unsupervised Selection of Correlated Features
A Derivative-Free Riemannian Powell’s Method, Minimizing Hartley-Entropy-Based ICA Contrast
Powell's derivative-free optimization method has been extended to a Riemannian manifold for the recovery of quasi-correlated sources by minimizing this contrast function, and has been demonstrated to converge faster than the related algorithms in the literature.

References

Showing 1–10 of 29 references
Kernel Methods for Measuring Independence
Two new functionals, the constrained covariance and the kernel mutual information, are introduced to measure the degree of independence of random variables and it is proved that when the RKHSs are universal, both functionals are zero if and only if the random variables are pairwise independent.
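These functionals are close relatives of the Hilbert–Schmidt independence criterion from the same research line; as a hedged sketch of how such RKHS statistics are computed in practice (the biased HSIC estimator, with Gaussian kernels and 1-D samples assumed for brevity):

import numpy as np

def hsic_biased(x, y, sigma=1.0):
    # Biased HSIC estimate tr(K H L H) / (n - 1)^2 with Gaussian Gram
    # matrices K, L and the centering matrix H = I - 11^T / n.
    def gram(z):
        d = z[:, None] - z[None, :]
        return np.exp(-d**2 / (2.0 * sigma**2))
    n = len(x)
    K, L = gram(np.asarray(x, float)), gram(np.asarray(y, float))
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1)**2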
A Reproducing Kernel Hilbert Space Framework for Information-Theoretic Learning
All the statistical descriptors in the original information-theoretic learning formulation can be rewritten as algebraic computations on deterministic functional vectors in the ITL RKHS, instead of limiting the functional view to the estimators as is commonly done in kernel methods.
Nonparametric Tests for Serial Independence Based on Quadratic Forms
The aim of this paper is to introduce tests for serial independence using kernel-based quadratic forms, which separates the problem of consistently estimating the divergence measure from that of consistently estimating the underlying joint densities, the existence of which is no longer required.
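In the notation of the main paper, these quadratic forms are weighted L2-type distances between the joint density and the product of marginals; schematically (my rendering, with \kappa a symmetric strictly positive-definite kernel on the joint space):

Q \;=\; \iint \big(p_{XY} - p_X p_Y\big)(u)\,\kappa(u,v)\,\big(p_{XY} - p_X p_Y\big)(v)\,du\,dv ,

which is nonnegative and vanishes if and only if p_{XY} = p_X p_Y.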
A test of independence based on a generalized correlation function
Kernel Choice and Classifiability for RKHS Embeddings of Probability Distributions
It is established that MMD corresponds to the optimal risk of a kernel classifier, thus forming a natural link between the distance between distributions and their ease of classification, and a generalization of the MMD is proposed for families of kernels.
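A minimal sketch of the (biased) MMD statistic this reference studies, assuming a Gaussian kernel and 1-D samples (names and defaults are illustrative):

import numpy as np

def mmd2_biased(x, y, sigma=1.0):
    # Biased estimate of MMD^2: E k(x,x') + E k(y,y') - 2 E k(x,y),
    # with all expectations replaced by sample means.
    def k(a, b):
        return np.exp(-(a[:, None] - b[None, :])**2 / (2.0 * sigma**2))
    x, y = np.asarray(x, float), np.asarray(y, float)
    return k(x, x).mean() + k(y, y).mean() - 2.0 * k(x, y).mean()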
Fast and robust fixed-point algorithms for independent component analysis
  • A. Hyvärinen
  • Computer Science, Mathematics
    IEEE Trans. Neural Networks
  • 1999
Using maximum entropy approximations of differential entropy, a family of new contrast (objective) functions for ICA is introduced; these enable both estimation of the whole decomposition by minimizing mutual information and estimation of individual independent components as projection pursuit directions.
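The fixed-point iteration behind this reference is compact enough to sketch; the version below assumes centered, whitened data and the tanh nonlinearity (one standard choice among several in the paper):

import numpy as np

def fastica_one_unit(Z, n_iter=200, tol=1e-8, seed=0):
    # One-unit FastICA on whitened data Z of shape (d, n):
    # w <- E[z * g(w^T z)] - E[g'(w^T z)] * w, then renormalize,
    # with g = tanh and g'(u) = 1 - tanh(u)^2.
    rng = np.random.default_rng(seed)
    w = rng.normal(size=Z.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        u = np.tanh(w @ Z)
        w_new = (Z * u).mean(axis=1) - (1.0 - u**2).mean() * w
        w_new /= np.linalg.norm(w_new)
        if 1.0 - abs(w_new @ w) < tol:   # converged up to sign
            return w_new
        w = w_new
    return w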
Information Theoretic Learning - Renyi's Entropy and Kernel Perspectives
  • J. Príncipe
  • Computer Science
    Information Theoretic Learning
  • 2010
Students, practitioners and researchers interested in statistical signal processing, computational intelligence, and machine learning will find the theory to understand the basics, the algorithms to implement applications, and exciting but still unexplored leads that will provide fertile ground for future research in this book.
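The book's workhorse quantity is Rényi's quadratic entropy, H_2(X) = -\log \int p^2(x)\,dx, whose Parzen plug-in estimate (the "information potential") reduces to pairwise kernel evaluations; a minimal 1-D sketch with a Gaussian kernel (names are illustrative):

import numpy as np

def renyi_quadratic_entropy(x, sigma=1.0):
    # Information potential: the integral of the squared Parzen estimate
    # collapses to pairwise Gaussians of bandwidth sigma*sqrt(2).
    x = np.asarray(x, float)
    d = x[:, None] - x[None, :]
    s = sigma * np.sqrt(2.0)
    ip = np.mean(np.exp(-d**2 / (2.0 * s**2)) / (np.sqrt(2.0 * np.pi) * s))
    return -np.log(ip)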
Independent component analysis, A new concept?
  • P. Comon
  • Computer Science
    Signal Process.
  • 1994
ICA Using Spacings Estimates of Entropy
A new algorithm for the independent component analysis (ICA) problem, based on an efficient entropy estimator, is presented; it is simple, computationally efficient, intuitively appealing, and outperforms other well-known algorithms.
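The entropy estimator underlying this algorithm is of the m-spacing family; a hedged sketch of the classical Vasicek-style variant (the paper's exact estimator differs in details such as boundary handling and bias correction):

import numpy as np

def spacing_entropy(x, m=None):
    # m-spacing estimate of differential entropy:
    # H ~ mean_i log( n/(2m) * (x_(i+m) - x_(i-m)) ),
    # with order statistics clamped at the sample boundaries.
    x = np.sort(np.asarray(x, float))
    n = len(x)
    if m is None:
        m = max(1, int(np.sqrt(n)))          # common heuristic choice
    upper = x[np.minimum(np.arange(n) + m, n - 1)]
    lower = x[np.maximum(np.arange(n) - m, 0)]
    spacings = np.maximum(upper - lower, 1e-12)  # guard against ties
    return np.mean(np.log(n / (2.0 * m) * spacings))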
Kernel independent component analysis
  • F. Bach, Michael I. Jordan
  • Computer Science
    2003 IEEE International Conference on Acoustics, Speech, and Signal Processing, 2003. Proceedings. (ICASSP '03).
  • 2003
A class of algorithms for independent component analysis that use contrast functions based on canonical correlations in a reproducing kernel Hilbert space is presented and shown to outperform many of the presently known algorithms.