# Dimensionality Reduction with Generalized Linear Models

```bibtex
@inproceedings{Chen2013DimensionalityRW,
  title     = {Dimensionality Reduction with Generalized Linear Models},
  author    = {Mo Chen and Wei Li and Xiaogang Wang and Wayne Zhang},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2013}
}
```

In this paper, we propose Generalized Linear Principal Component Analysis (GLPCA), a general dimensionality reduction method, based on the generalized linear model, for data generated from a very broad family of distributions and nonlinear functions. Data from different domains often have very different structures, which can be modeled by different distributions and reconstruction functions. For example, real-valued data can be modeled by the Gaussian distribution with a linear…
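
The core idea, that replacing the Gaussian/linear pair with another exponential-family distribution and link yields PCA for other data types, can be illustrated with a minimal sketch for binary data (logistic PCA, one special case of this family). Note this is an illustrative gradient-descent sketch, not the paper's algorithm; the factor shapes, learning rate, and iteration count are assumptions:

```python
import numpy as np

def logistic_pca(X, k=2, n_iters=500, lr=0.1, seed=0):
    """Illustrative exponential-family PCA for binary data.

    Fits X ~ sigmoid(U @ V.T) by minimizing the Bernoulli negative
    log-likelihood with plain gradient descent on both factors.
    Hyperparameters are illustrative, not taken from the paper.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    U = 0.01 * rng.standard_normal((n, k))  # low-dimensional scores
    V = 0.01 * rng.standard_normal((d, k))  # loadings (basis vectors)
    for _ in range(n_iters):
        theta = U @ V.T                      # natural parameters
        P = 1.0 / (1.0 + np.exp(-theta))     # means via the logistic link
        G = P - X                            # gradient of the NLL w.r.t. theta
        U -= lr * (G @ V) / d
        V -= lr * (G.T @ U) / n
    return U, V
```

Swapping the Bernoulli likelihood and logistic link for, say, a Poisson likelihood with an exponential link would give the analogous reduction for count data, which is the flexibility the GLM framing buys.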

## 4 Citations

### Deviance Matrix Factorization

- Computer Science, ArXiv
- 2021

The theoretical and empirical results indicate that the proposed decomposition is more flexible, general, and robust, and can thus provide improved performance when compared to similar methods.

### IKLTSA: An Incremental Kernel LTSA Method

- Computer Science, MLDM
- 2015

An incremental version of the manifold learning algorithm LTSA based on the kernel method, called IKLTSA (short for Incremental Kernel LTSA), which exploits the advantages of the kernel method and can detect the explicit mapping from high-dimensional data points to their low-dimensional embedding coordinates.

### Deep Descriptor Transforming for Image Co-Localization

- Computer Science, IJCAI
- 2017

This paper proposes a simple but effective method, named Deep Descriptor Transforming (DDT), for evaluating the correlations of descriptors and then obtaining the category-consistent regions, which can accurately locate the common object in a set of images.

### Benchmarking principal component analysis for large-scale single-cell RNA-sequencing

- Computer Science, bioRxiv
- 2019

A guideline is developed for selecting an appropriate PCA implementation based on differences in the computational environments of users and developers; the benchmarks show that some PCA algorithms based on Krylov subspaces and randomized singular value decomposition are fast, memory-efficient, and more accurate than the other algorithms.

## 19 References

### A Generalized Linear Model for Principal Component Analysis of Binary Data

- Computer Science, AISTATS
- 2003

An alternating least squares method is derived to estimate the basis vectors and generalized linear coefficients of the logistic PCA model, a generalized linear model for dimensionality reduction of binary data that is related to principal component analysis (PCA) and much better suited to modeling binary data than conventional PCA.

### A Generalization of Principal Components Analysis to the Exponential Family

- Computer Science, NIPS
- 2001

This paper draws on ideas from the exponential family, generalized linear models, and Bregman distances to give a generalization of PCA to loss functions that, it is argued, are better suited to other data types.

### Graphical Models, Exponential Families, and Variational Inference

- Computer Science, Found. Trends Mach. Learn.
- 2008

The variational approach provides a complementary alternative to Markov chain Monte Carlo as a general source of approximation methods for inference in large-scale statistical models.

### Generalized² Linear² Models

- Mathematics, Computer Science, NIPS
- 2002

The Generalized² Linear² Model ((GL)²M) is introduced, a statistical estimator which combines features of nonlinear regression and factor analysis, and an iterative procedure which optimizes the parameters of a (GL)²M is presented.

### Closed-form supervised dimensionality reduction with generalized linear models

- Computer Science, ICML '08
- 2008

We propose a family of supervised dimensionality reduction (SDR) algorithms that combine feature extraction (dimensionality reduction) with learning a predictive model in a unified optimization…

### Relative Loss Bounds for Multidimensional Regression Problems

- Computer Science, Machine Learning
- 2004

A unified treatment of on-line generalized linear regression with multidimensional outputs (i.e., neural networks with multiple output nodes but no hidden nodes) is studied, generalizing earlier results for the gradient descent and exponentiated gradient algorithms to multidimensional outputs, including multiclass logistic regression.

### Relative Loss Bounds for On-Line Density Estimation with the Exponential Family of Distributions

- Computer Science, Machine Learning
- 2004

This work considers on-line density estimation with a parameterized density from the exponential family, using a Bregman divergence to derive and analyze each algorithm in order to design algorithms with the best possible relative loss bounds.

### EM Algorithms for PCA and SPCA

- Computer Science, NIPS
- 1997

An expectation-maximization (EM) algorithm for principal component analysis (PCA) which allows a few eigenvectors and eigenvalues to be extracted from large collections of high-dimensional data and defines a proper density model in the data space.
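
The EM recursion this reference describes alternates between inferring latent coordinates and re-estimating the subspace. A minimal sketch of the zero-noise special case follows; the shapes and iteration count are illustrative assumptions:

```python
import numpy as np

def em_pca(X, k=2, n_iters=50, seed=0):
    """Sketch of the EM algorithm for PCA (zero-noise special case).

    X is (d, n) with centered observations as columns. Alternates:
      E-step:  Z = (W^T W)^{-1} W^T X      (infer latent coordinates)
      M-step:  W = X Z^T (Z Z^T)^{-1}      (re-estimate the subspace)
    W converges to a basis spanning the top-k principal subspace
    (not necessarily the orthonormal eigenvectors themselves).
    """
    rng = np.random.default_rng(seed)
    d, n = X.shape
    W = rng.standard_normal((d, k))
    for _ in range(n_iters):
        Z = np.linalg.solve(W.T @ W, W.T @ X)    # E-step
        W = X @ Z.T @ np.linalg.inv(Z @ Z.T)     # M-step
    return W
```

Each iteration costs O(dnk), so a few eigenvectors can be extracted without ever forming the d×d covariance matrix, which is the scalability advantage the reference highlights.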

### An Analysis of Transformations

- Mathematics
- 1964

[Read at a RESEARCH METHODS MEETING of the SOCIETY, April 8th, 1964, Professor D. V. LINDLEY in the Chair] SUMMARY: In the analysis of data it is often assumed that observations y1, y2, …, yn are…

### Natural Gradient Works Efficiently in Learning

- Computer Science, Neural Computation
- 1998

The dynamical behavior of natural gradient online learning is analyzed and proved to be Fisher efficient, implying that it asymptotically attains the same performance as optimal batch estimation of the parameters.