• Corpus ID: 202595202

Canonical dependency analysis based on squared-loss mutual information

Masayuki Karasuyama, Masashi Sugiyama
Canonical correlation analysis (CCA) is a classical dimensionality reduction technique for two sets of variables that iteratively finds projection directions with maximum correlation. Although CCA is still widely used in many practical application areas, recent real-world data often contain more complicated non-linear dependencies that cannot be properly captured by classical CCA. In this paper, we thus propose an extension of CCA that can effectively capture such complicated non-linear… 
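As background for the abstract above, classical linear CCA can be computed in closed form via an SVD of the whitened cross-covariance matrix. The sketch below is an illustration of that standard formulation, not of the paper's squared-loss mutual information extension; the regularization parameter `reg` is an assumption added for numerical stability.

```python
import numpy as np

def cca(X, Y, n_components=1, reg=1e-6):
    """Classical linear CCA: find directions a, b maximizing corr(Xa, Yb).

    Solved via SVD of the whitened cross-covariance; the singular values
    are the canonical correlations.
    """
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Cxx = X.T @ X / n + reg * np.eye(X.shape[1])  # regularized covariance of X
    Cyy = Y.T @ Y / n + reg * np.eye(Y.shape[1])  # regularized covariance of Y
    Cxy = X.T @ Y / n                             # cross-covariance
    Lx = np.linalg.cholesky(Cxx)
    Ly = np.linalg.cholesky(Cyy)
    # Whitened cross-covariance K = Lx^{-1} Cxy Ly^{-T}
    K = np.linalg.solve(Lx, np.linalg.solve(Ly, Cxy.T).T)
    U, s, Vt = np.linalg.svd(K)
    A = np.linalg.solve(Lx.T, U[:, :n_components])    # projections for X
    B = np.linalg.solve(Ly.T, Vt.T[:, :n_components]) # projections for Y
    return A, B, s[:n_components]
```

Because the solution is a plain eigenvalue/SVD problem, linear CCA captures only linear correlations, which is exactly the limitation the abstract points at.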

Canonical Correlation Analysis for Multilabel Classification: A Least-Squares Formulation, Extensions, and Analysis
It is shown that under a mild condition which tends to hold for high-dimensional data, CCA in the multilabel case can be formulated as a least-squares problem, and several CCA extensions are proposed, including the sparse CCA formulation based on the 1-norm regularization.
Multi-Label Prediction via Sparse Infinite CCA
A nonparametric, fully Bayesian framework that can automatically select the number of correlation components, and effectively capture the sparsity underlying the projections is proposed.
Canonical correlation analysis using within-class coupling
Sufficient Dimension Reduction via Squared-Loss Mutual Information Estimation
A novel sufficient dimension-reduction method using a squared-loss variant of mutual information as a dependency measure that is formulated as a minimum contrast estimator on parametric or nonparametric models and a natural gradient algorithm on the Grassmann manifold for sufficient subspace search.
Measuring Statistical Dependence with Hilbert-Schmidt Norms
We propose an independence criterion based on the eigen-spectrum of covariance operators in reproducing kernel Hilbert spaces (RKHSs), consisting of an empirical estimate of the Hilbert-Schmidt norm
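The Hilbert-Schmidt independence criterion (HSIC) described above has a simple biased empirical estimator, HSIC = tr(KHLH)/(n-1)^2, where K and L are Gram matrices on the two samples and H is the centering matrix. A minimal sketch with Gaussian kernels (the bandwidth `sigma` is an illustrative choice, not from the abstract):

```python
import numpy as np

def hsic(X, Y, sigma=1.0):
    """Biased empirical HSIC: trace(K H L H) / (n-1)^2."""
    n = X.shape[0]

    def gram(Z):
        # Gaussian (RBF) Gram matrix from pairwise squared distances
        sq = np.sum(Z ** 2, axis=1)
        D = sq[:, None] + sq[None, :] - 2 * Z @ Z.T
        return np.exp(-D / (2 * sigma ** 2))

    K, L = gram(X), gram(Y)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2
```

HSIC is (near) zero for independent samples and grows with statistical dependence, which is what makes it usable as a dependence measure for dimensionality reduction.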
Canonical correlation analysis based on information theory
Kernel Canonical Correlation Analysis
A new non-linear feature extraction technique based on Canonical Correlation Analysis (CCA), which is especially well suited for relating two sets of measurements and compared to standard feature extraction methods based on PCA.
Kernel dimension reduction in regression
We present a new methodology for sufficient dimension reduction (SDR). Our methodology derives directly from the formulation of SDR in terms of the conditional independence of the covariate X from
Feature discovery under contextual supervision using mutual information
  • J. Kay
  • Mathematics
    [Proceedings 1992] IJCNN International Joint Conference on Neural Networks
  • 1992
The author considers a neural network in which the inputs may be divided into two groups, termed primary inputs and contextual inputs. The goal of the network is to discover those linear functions of
A Least-squares Approach to Direct Importance Estimation
This paper proposes a new importance estimation method that has a closed-form solution; the leave-one-out cross-validation score can also be computed analytically and is computationally highly efficient and simple to implement.
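The closed-form least-squares importance estimator summarized above models the density ratio w(x) = p_test(x)/p_train(x) as a kernel expansion and fits the coefficients by regularized least squares. The following is a hedged sketch in the spirit of that approach; the Gaussian-kernel model, center selection, and parameter names (`sigma`, `lam`, `n_centers`) are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

def lsq_importance(x_train, x_test, sigma=1.0, lam=0.1, n_centers=100):
    """Least-squares importance estimation (sketch).

    Model w(x) = sum_l alpha_l * K(x, c_l) with Gaussian kernels centered
    at test points, and solve for alpha in closed form:
        alpha = (H + lam I)^{-1} h,
    where H averages kernel products over train points and h averages
    kernel values over test points.
    """
    rng = np.random.default_rng(0)
    idx = rng.choice(len(x_test), size=min(n_centers, len(x_test)), replace=False)
    centers = x_test[idx]

    def phi(X):
        D = np.sum((X[:, None, :] - centers[None, :, :]) ** 2, axis=2)
        return np.exp(-D / (2 * sigma ** 2))

    Phi_tr, Phi_te = phi(x_train), phi(x_test)
    H = Phi_tr.T @ Phi_tr / len(x_train)  # (b, b) kernel product average
    h = Phi_te.mean(axis=0)               # (b,) kernel mean over test points
    alpha = np.linalg.solve(H + lam * np.eye(len(h)), h)
    return np.maximum(Phi_tr @ alpha, 0)  # ratio estimates at train points
```

Because the solution is a single linear solve, the leave-one-out cross-validation score mentioned in the abstract can likewise be obtained analytically, which is the source of the method's computational efficiency.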