# Canonical dependency analysis based on squared-loss mutual information

@article{Karasuyama2012CanonicalDA, title={Canonical dependency analysis based on squared-loss mutual information}, author={Masayuki Karasuyama and Masashi Sugiyama}, journal={Neural networks : the official journal of the International Neural Network Society}, year={2012}, volume={34}, pages={46--55} }
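For orientation, the squared-loss mutual information (SMI) at the heart of this paper is the Pearson divergence between the joint density and the product of the marginals. A minimal plug-in sketch for discrete distributions follows; the function name is illustrative, and note that the paper itself estimates the density ratio directly rather than plugging in density estimates:

```python
import numpy as np

def smi_from_joint(p_xy):
    """Squared-loss mutual information of a discrete joint distribution p(x, y).

    SMI is the Pearson divergence between p(x, y) and p(x) p(y):
        SMI = 1/2 * sum_{x,y} p(x) p(y) * (p(x,y) / (p(x) p(y)) - 1)^2
    It is zero if and only if X and Y are independent.
    """
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of x, shape (nx, 1)
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of y, shape (1, ny)
    prod = p_x * p_y                        # product of marginals, shape (nx, ny)
    mask = prod > 0
    ratio = np.ones_like(p_xy)              # ratio = 1 wherever prod = 0 (zero contribution)
    ratio[mask] = p_xy[mask] / prod[mask]
    return 0.5 * float(np.sum(prod * (ratio - 1.0) ** 2))

# Perfectly dependent binary variables give SMI = 0.5;
# any independent joint (an outer product of marginals) gives SMI = 0.
print(smi_from_joint([[0.5, 0.0], [0.0, 0.5]]))   # 0.5
print(smi_from_joint(np.outer([0.3, 0.7], [0.5, 0.5])))  # 0.0
```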

## 25 Citations

Machine Learning with Squared-Loss Mutual Information

- Computer Science · Entropy
- 2013

Recent developments in SMI approximation based on direct density-ratio estimation, and SMI-based machine learning techniques such as independence testing, dimensionality reduction, canonical dependency analysis, independent component analysis, object matching, clustering, and causal inference, are reviewed.

Probabilistic CCA with Implicit Distributions

- Computer Science · ArXiv
- 2019

This work presents Conditional Mutual Information (CMI) as a new criterion for CCA to consider both linear and nonlinear dependency for arbitrarily distributed data and derives an objective which can provide an estimation for CMI with efficient inference methods.

Canonical analysis based on mutual information

- Mathematics · 2015 IEEE International Geoscience and Remote Sensing Symposium (IGARSS)
- 2015

This contribution replaces (linear) correlation as the measure of association between the linear combinations with the information-theoretic measure mutual information (MI), and terms this type of analysis canonical information analysis (CIA).

Change detection in bi-temporal data by canonical information analysis

- Mathematics · 2015 8th International Workshop on the Analysis of Multitemporal Remote Sensing Images (Multi-Temp)
- 2015

This contribution replaces (linear) correlation as the measure of association between the linear combinations with the information-theoretic measure mutual information (MI), and terms this type of analysis canonical information analysis (CIA).

Interpretation of images from intensity, texture and geometry

- Environmental Science
- 2015

Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized. In canonical…

Estimation of mutual information by the fuzzy histogram

- Computer Science · Fuzzy Optim. Decis. Mak.
- 2014

Fuzzy partitioning is suggested for histogram-based MI estimation, using a general form of fuzzy membership functions that includes crisp membership functions as a special case; the average absolute error of the fuzzy-histogram method is shown to be less than that of the naïve histogram method.
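As context for this entry, the naïve (crisp) histogram baseline that the fuzzy-histogram method is compared against is the simple plug-in MI estimate from a 2-D histogram. A minimal sketch, with illustrative bin count and data:

```python
import numpy as np

def histogram_mi(x, y, bins=10):
    """Plug-in MI estimate from a crisp 2-D histogram (the naive baseline).

    MI = sum_{i,j} p_ij * log(p_ij / (p_i * p_j)), in nats.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()                  # empirical joint distribution
    p_x = p_xy.sum(axis=1, keepdims=True)       # marginal of x
    p_y = p_xy.sum(axis=0, keepdims=True)       # marginal of y
    mask = p_xy > 0                             # 0 * log 0 = 0 by convention
    return float(np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x * p_y)[mask])))

rng = np.random.default_rng(0)
x = rng.normal(size=2000)
print(histogram_mi(x, x))                      # large: x fully determines itself
print(histogram_mi(x, rng.normal(size=2000)))  # near zero (small positive bias)
```

The small positive value on independent data is the finite-sample bias of the crisp estimator, which is precisely what smoother (e.g. fuzzy) partitions aim to reduce.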

Distance Covariance Analysis

- Computer Science · AISTATS
- 2017

A dimensionality reduction method to identify linear projections that capture interactions between two or more sets of variables that can detect both linear and nonlinear relationships, and can take dependent variables into account is proposed.
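A minimal sketch of the sample statistic underlying this method: the (biased) squared distance covariance for one-dimensional variables, computed by double-centering the pairwise distance matrices and averaging their elementwise product. The function name is an illustrative assumption:

```python
import numpy as np

def dcov_sq(x, y):
    """Biased sample squared distance covariance for 1-D inputs.

    Double-center each pairwise distance matrix, then average the
    elementwise product; the population quantity is zero if and only
    if X and Y are independent.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    a = np.abs(x[:, None] - x[None, :])          # pairwise distances in x
    b = np.abs(y[:, None] - y[None, :])          # pairwise distances in y
    A = a - a.mean(axis=0) - a.mean(axis=1)[:, None] + a.mean()
    B = b - b.mean(axis=0) - b.mean(axis=1)[:, None] + b.mean()
    return float((A * B).mean())

x = np.linspace(0.0, 1.0, 30)
print(dcov_sq(x, x ** 2))            # positive: y is a (nonlinear) function of x
print(dcov_sq(x, np.full(30, 2.0)))  # 0.0: constant y carries no dependence
```

Unlike Pearson correlation, this statistic is positive for the nonlinear relation y = x², which is why the cited method can detect nonlinear relationships between projections.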

Direct Approximation of Divergences Between Probability Distributions

- Computer Science · Empirical Inference
- 2013

This chapter reviews recent advances in direct divergence approximation that follow the general inference principle advocated by Vladimir Vapnik—one should not solve a more general problem as an intermediate step when approximating a divergence.

Divergence estimation for machine learning and signal processing

- Computer Science · 2013 International Winter Workshop on Brain-Computer Interface (BCI)
- 2013

This talk reviews recent advances in direct divergence approximation that follow the general inference principle advocated by Vladimir Vapnik and argues that the latter approximators are more useful in practice due to their computational efficiency, high numerical stability, and superior robustness against outliers.

## References

Showing 1-10 of 76 references

Canonical Correlation Analysis for Multilabel Classification: A Least-Squares Formulation, Extensions, and Analysis

- Computer Science · IEEE Transactions on Pattern Analysis and Machine Intelligence
- 2011

It is shown that under a mild condition which tends to hold for high-dimensional data, CCA in the multilabel case can be formulated as a least-squares problem, and several CCA extensions are proposed, including the sparse CCA formulation based on the 1-norm regularization.

Canonical correlation analysis using within-class coupling

- Computer Science · Pattern Recognit. Lett.
- 2011

Multi-Label Prediction via Sparse Infinite CCA

- Computer Science · NIPS
- 2009

A nonparametric, fully Bayesian framework that can automatically select the number of correlation components, and effectively capture the sparsity underlying the projections is proposed.

Sufficient Dimension Reduction via Squared-Loss Mutual Information Estimation

- Computer Science · Neural Computation
- 2013

A novel sufficient dimension-reduction method using a squared-loss variant of mutual information as a dependency measure that is formulated as a minimum contrast estimator on parametric or nonparametric models and a natural gradient algorithm on the Grassmann manifold for sufficient subspace search.

Measuring Statistical Dependence with Hilbert-Schmidt Norms

- Computer Science, Mathematics · ALT
- 2005

We propose an independence criterion based on the eigen-spectrum of covariance operators in reproducing kernel Hilbert spaces (RKHSs), consisting of an empirical estimate of the Hilbert-Schmidt norm…
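The Hilbert-Schmidt independence criterion (HSIC) proposed here has a widely used biased empirical estimator, trace(KHLH)/n², where K and L are Gram matrices and H is the centering matrix. A sketch with Gaussian kernels; the function name and bandwidth choice are illustrative:

```python
import numpy as np

def hsic(x, y, sigma=1.0):
    """Biased empirical HSIC with Gaussian kernels, for 1-D inputs.

    HSIC_n = trace(K H L H) / n^2, with H = I - (1/n) * ones((n, n)).
    Zero (in population) if and only if X and Y are independent,
    given characteristic kernels such as the Gaussian.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)

    def gram(z):
        d2 = (z[:, None] - z[None, :]) ** 2
        return np.exp(-d2 / (2.0 * sigma ** 2))

    K, L = gram(x), gram(y)
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    return float(np.trace(K @ H @ L @ H)) / n ** 2

x = np.linspace(0.0, 1.0, 20)
print(hsic(x, x))              # positive: identical variables are dependent
print(hsic(x, np.zeros(20)))   # 0.0: a constant is independent of everything
```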

A kernel method for canonical correlation analysis

- Computer Science · ArXiv
- 2006

The effectiveness of applying the kernel method to canonical correlation analysis is investigated, showing an efficient approach to improving this linear method.

Mutual information estimation reveals global associations between stimuli and biological processes

- Biology · BMC Bioinformatics
- 2009

A novel feature selection method called Least-Squares Mutual Information (LSMI) computes mutual information without density estimation; LSMI can therefore detect nonlinear associations within a cell, revealing the global organization of cellular process control.

Feature discovery under contextual supervision using mutual information

- Mathematics · [Proceedings 1992] IJCNN International Joint Conference on Neural Networks
- 1992

The author considers a neural network in which the inputs may be divided into two groups, termed primary inputs and contextual inputs. The goal of the network is to discover those linear functions of…

Kernel dimension reduction in regression

- Mathematics
- 2009

We present a new methodology for sufficient dimension reduction (SDR). Our methodology derives directly from the formulation of SDR in terms of the conditional independence of the covariate X from…