Canonical dependency analysis based on squared-loss mutual information

@article{Karasuyama2012CanonicalDA,
  title={Canonical dependency analysis based on squared-loss mutual information},
  author={Masayuki Karasuyama and Masashi Sugiyama},
  journal={Neural Networks},
  year={2012},
  volume={34},
  pages={46--55}
}
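The squared-loss mutual information (SMI) underlying this paper is the Pearson-divergence analogue of ordinary mutual information, built on the density ratio r(x, y) = p(x, y) / (p(x)p(y)). A minimal sketch for the discrete case follows (NumPy assumed; the function name is illustrative, not from the paper):

```python
import numpy as np

def squared_loss_mi(joint):
    """Squared-loss mutual information for a discrete joint table:
    SMI = 1/2 * sum_{x,y} p(x)p(y) * (r(x,y) - 1)^2,
    where r(x,y) = p(x,y) / (p(x)p(y)) is the density ratio."""
    joint = np.asarray(joint, dtype=float)
    joint = joint / joint.sum()                    # normalise to a distribution
    px = joint.sum(axis=1, keepdims=True)          # marginal p(x)
    py = joint.sum(axis=0, keepdims=True)          # marginal p(y)
    prod = px * py                                 # product of marginals
    ratio = np.divide(joint, prod, out=np.zeros_like(joint), where=prod > 0)
    return float(0.5 * np.sum(prod * (ratio - 1.0) ** 2))

# Independent variables (joint = outer product of marginals) give SMI = 0
print(squared_loss_mi(np.outer([0.3, 0.7], [0.5, 0.5])))  # 0.0
# Perfectly dependent variables give a strictly positive SMI
print(squared_loss_mi(np.eye(2) / 2))                     # 0.5
```

SMI vanishes exactly when the variables are independent, which is what makes it usable as a dependency measure in canonical dependency analysis; in practice the paper estimates the density ratio directly rather than via a joint table.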


Machine Learning with Squared-Loss Mutual Information
TLDR
Recent developments in SMI approximation based on direct density-ratio estimation and SMI-based machine learning techniques such as independence testing, dimensionality reduction, canonical dependency analysis, independent component analysis, object matching, clustering, and causal inference are reviewed.
Probabilistic CCA with Implicit Distributions
TLDR
This work presents Conditional Mutual Information (CMI) as a new criterion for CCA to consider both linear and nonlinear dependency for arbitrarily distributed data and derives an objective which can provide an estimation for CMI with efficient inference methods.
Canonical analysis based on mutual information
TLDR
This contribution replaces (linear) correlation as the measure of association between the linear combinations with the information theoretical measure mutual information (MI), and term this type of analysis canonical information analysis (CIA).
Change detection in bi-temporal data by canonical information analysis
Interpretation of images from intensity, texture and geometry
Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized.
Estimation of mutual information by the fuzzy histogram
TLDR
Fuzzy partitioning is suggested for histogram-based MI estimation, using a general form of fuzzy membership functions that includes the class of crisp membership functions as a special case; the average absolute error of the fuzzy-histogram method is shown to be less than that of the naïve histogram method.
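The naïve (crisp) histogram estimator that the fuzzy-histogram method improves on can be sketched as follows (NumPy assumed; the function name is illustrative):

```python
import numpy as np

def hist_mi(x, y, bins=10):
    """Naive (crisp) histogram estimate of mutual information in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                               # joint cell probabilities
    px = pxy.sum(axis=1, keepdims=True)            # marginal over x-bins
    py = pxy.sum(axis=0, keepdims=True)            # marginal over y-bins
    nz = pxy > 0                                   # skip empty cells (log 0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px * py)[nz])))

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
print(hist_mi(x, x))                               # strong dependence: large MI
print(hist_mi(x, rng.normal(size=10_000)))         # independence: near 0
```

The hard bin boundaries are exactly what fuzzy partitioning softens: near-boundary samples contribute to several cells, which reduces the estimator's sensitivity to bin placement.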
Distance Covariance Analysis
TLDR
A dimensionality reduction method to identify linear projections that capture interactions between two or more sets of variables that can detect both linear and nonlinear relationships, and can take dependent variables into account is proposed.
Direct Approximation of Divergences Between Probability Distributions
TLDR
This chapter reviews recent advances in direct divergence approximation that follow the general inference principle advocated by Vladimir Vapnik—one should not solve a more general problem as an intermediate step when approximating a divergence.
Divergence estimation for machine learning and signal processing
  • M. Sugiyama
  • Computer Science
    2013 International Winter Workshop on Brain-Computer Interface (BCI)
  • 2013
TLDR
This talk reviews recent advances in direct divergence approximation that follow the general inference principle advocated by Vladimir Vapnik and argues that the latter approximators are more useful in practice due to their computational efficiency, high numerical stability, and superior robustness against outliers.

References

SHOWING 1-10 OF 76 REFERENCES
Canonical Correlation Analysis for Multilabel Classification: A Least-Squares Formulation, Extensions, and Analysis
TLDR
It is shown that under a mild condition which tends to hold for high-dimensional data, CCA in the multilabel case can be formulated as a least-squares problem, and several CCA extensions are proposed, including a sparse CCA formulation based on 1-norm regularization.
Canonical correlation analysis using within-class coupling
Multi-Label Prediction via Sparse Infinite CCA
TLDR
A nonparametric, fully Bayesian framework that can automatically select the number of correlation components, and effectively capture the sparsity underlying the projections is proposed.
Sufficient Dimension Reduction via Squared-Loss Mutual Information Estimation
TLDR
A novel sufficient dimension-reduction method using a squared-loss variant of mutual information as a dependency measure that is formulated as a minimum contrast estimator on parametric or nonparametric models and a natural gradient algorithm on the Grassmann manifold for sufficient subspace search.
Measuring Statistical Dependence with Hilbert-Schmidt Norms
We propose an independence criterion based on the eigen-spectrum of covariance operators in reproducing kernel Hilbert spaces (RKHSs), consisting of an empirical estimate of the Hilbert-Schmidt norm of the cross-covariance operator.
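The biased empirical form of this criterion (HSIC) reduces to a trace of centered Gram matrices. A minimal sketch with Gaussian kernels (NumPy assumed; `sigma` is an illustrative bandwidth choice, not a recommended setting):

```python
import numpy as np

def hsic(X, Y, sigma=1.0):
    """Biased empirical HSIC with Gaussian kernels: tr(K H L H) / (n-1)^2."""
    n = X.shape[0]

    def gram(Z):
        sq = np.sum(Z ** 2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T   # squared distances
        return np.exp(-d2 / (2.0 * sigma ** 2))           # Gaussian kernel

    K, L = gram(X), gram(Y)
    H = np.eye(n) - np.ones((n, n)) / n                   # centering matrix
    return float(np.trace(K @ H @ L @ H)) / (n - 1) ** 2

rng = np.random.default_rng(0)
x = rng.normal(size=(200, 1))
print(hsic(x, x ** 2))                     # nonlinear dependence: larger value
print(hsic(x, rng.normal(size=(200, 1))))  # independence: value near 0
```

Note the `x` vs `x**2` pair: their linear correlation is near zero, yet HSIC still flags the dependence, which is why kernel criteria like this are natural building blocks for the nonlinear CCA variants listed on this page.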
Canonical correlation analysis based on information theory
A kernel method for canonical correlation analysis
TLDR
The effectiveness of applying the kernel method to canonical correlation analysis is investigated, showing an efficient approach to improving this linear method.
Mutual information estimation reveals global associations between stimuli and biological processes
TLDR
A novel feature selection method called Least-Squares Mutual Information (LSMI) computes mutual information without density estimation; LSMI can therefore detect nonlinear associations within a cell, revealing the global organization of cellular process control.
Feature discovery under contextual supervision using mutual information
  • J. Kay
  • Mathematics
    [Proceedings 1992] IJCNN International Joint Conference on Neural Networks
  • 1992
The author considers a neural network in which the inputs may be divided into two groups, termed primary inputs and contextual inputs. The goal of the network is to discover those linear functions of
Kernel dimension reduction in regression
We present a new methodology for sufficient dimension reduction (SDR). Our methodology derives directly from the formulation of SDR in terms of the conditional independence of the covariate X from