# Large Scale Variational Inference and Experimental Design for Sparse Generalized Linear Models

```bibtex
@inproceedings{Seeger2009LargeSV,
  title     = {Large Scale Variational Inference and Experimental Design for Sparse Generalized Linear Models},
  author    = {Matthias W. Seeger and Hannes Nickisch},
  booktitle = {Sampling-based Optimization in the Presence of Uncertainty},
  year      = {2009}
}
```

Sparsity is a fundamental concept of modern statistics, and often the only general principle available at the moment for addressing novel learning applications with many more variables than observations. While much progress has been made recently in the theoretical understanding and algorithmics of sparse point estimation, higher-order problems such as covariance estimation or optimal data acquisition are seldom addressed for sparsity-favouring models, and there are virtually no algorithms for…

## 22 Citations

Sparse linear models: Variational approximate inference and Bayesian experimental design

- Computer Science
- 2009

It is argued that many problems in practice, such as compressive sensing for real-world image reconstruction, are served much better by proper uncertainty approximations than by ever more aggressive sparse estimation algorithms.

Variational approximate inference in latent linear models

- Computer Science
- 2013

This thesis considers deterministic approximate inference methods based on minimising the Kullback-Leibler (KL) divergence between a given target density and an approximating `variational' density and presents a new method to perform KL variational inference for a broad class of approximating variational densities.
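The core idea of this line of work, fitting a Gaussian q = N(m, s^2) to an intractable posterior by minimising KL(q || p), can be illustrated in one dimension. The sketch below is an assumption-laden toy (a single logistic likelihood term sigma(3w) and a N(0, 1) prior, neither taken from the thesis), using Gauss-Hermite quadrature for the expected log-likelihood and a closed-form KL to the prior:

```python
import numpy as np
from scipy.optimize import minimize

# Gauss-Hermite nodes/weights: E_{N(m,s^2)}[f(w)] ~ (1/sqrt(pi)) sum_i w_i f(m + sqrt(2) s x_i)
nodes, weights = np.polynomial.hermite.hermgauss(40)

def gh_expect(f, m, s):
    return weights @ f(m + np.sqrt(2.0) * s * nodes) / np.sqrt(np.pi)

def log_sigmoid(z):
    return -np.logaddexp(0.0, -z)  # numerically stable log sigma(z)

def neg_elbo(params):
    # Toy target: p(w) propto N(w; 0, 1) * sigma(3w)  (illustrative, not from the thesis)
    m, log_s = params
    s = np.exp(log_s)
    kl_prior = 0.5 * (m**2 + s**2 - 1.0) - log_s   # KL(N(m,s^2) || N(0,1)), closed form
    ell = gh_expect(lambda w: log_sigmoid(3.0 * w), m, s)
    return kl_prior - ell                           # = KL(q || p) up to a constant

res = minimize(neg_elbo, x0=np.array([0.0, 0.0]))
m_opt, s_opt = res.x[0], np.exp(res.x[1])
```

Because the logistic term pulls mass towards positive w and is log-concave, the fitted mean comes out positive and the fitted variance shrinks below the prior's.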

Bayesian inference and experimental design for large generalised linear models

- Computer Science
- 2010

This thesis derives, studies and applies deterministic approximate inference and experimental design algorithms with a focus on the class of generalised linear models (GLMs), with special emphasis on algorithmic properties such as convexity, numerical stability, and scalability to large numbers of interacting variables.

Concave Gaussian Variational Approximations for Inference in Large-Scale Bayesian Linear Models

- Computer Science, AISTATS
- 2011

For a large class of models, the local variational method is equivalent to a weakened form of Kullback-Leibler Gaussian approximation, and the resulting objective is a concave function of the Gaussian parameters for log-concave sites.
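A standard example of a local variational site bound (one member of the family such papers analyse; the specific numbers below are purely illustrative) is the Jaakkola-Jordan quadratic lower bound on the logistic log-likelihood, log sigma(x) >= log sigma(xi) + (x - xi)/2 - lambda(xi)(x^2 - xi^2) with lambda(xi) = tanh(xi/2)/(4 xi), tight at x = +/- xi. It can be checked numerically:

```python
import numpy as np

def log_sigmoid(x):
    return -np.logaddexp(0.0, -x)

def lam(xi):
    # Variational parameter of the Jaakkola-Jordan bound
    return np.tanh(xi / 2.0) / (4.0 * xi)

def local_bound(x, xi):
    # Quadratic lower bound on log sigma(x), tight at x = +xi and x = -xi
    return log_sigmoid(xi) + (x - xi) / 2.0 - lam(xi) * (x**2 - xi**2)

x = np.linspace(-8.0, 8.0, 2001)
xi = 1.5
gap = log_sigmoid(x) - local_bound(x, xi)   # should be >= 0 everywhere
```

Replacing each logistic site with this quadratic makes the whole model Gaussian-conjugate, which is what makes local variational inference tractable.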

Gaussian Kullback-Leibler approximate inference

- Computer Science, J. Mach. Learn. Res.
- 2013

Numerical results comparing G-KL and other deterministic Gaussian approximate inference methods are presented for: robust Gaussian process regression models with either Student-t or Laplace likelihoods, large scale Bayesian binary logistic regression models, and Bayesian sparse linear models for sequential experimental design.

Variational Bayesian Inference Techniques

- Computer Science, IEEE Signal Processing Magazine
- 2010

Novel variational relaxations of Bayesian integration are described and characterized, as well as posterior maximization, which can be solved robustly for very large models by algorithms unifying convex reconstruction and Bayesian graphical model technology.

Gaussian sampling by local perturbations

- Computer Science, NIPS
- 2010

A technique for exact simulation of Gaussian Markov random fields (GMRFs), which can be interpreted as locally injecting noise to each Gaussian factor independently, followed by computing the mean/mode of the perturbed GMRF, which leads to an efficient unbiased estimator of marginal variances.
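The perturb-and-MAP recipe described in that snippet can be sketched for a toy Bayesian linear model (the model, sizes, and parameter values below are assumptions for illustration, not taken from the paper): add noise to each Gaussian factor's data term independently, then solve the resulting MAP problem; the solutions are exact draws from the posterior N(mu, J^{-1}):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Bayesian linear model: prior w ~ N(0, (1/lam) I), likelihood y ~ N(Xw, s2 I)
n, d, lam, s2 = 30, 4, 0.5, 0.1
X = rng.standard_normal((n, d))
y = X @ rng.standard_normal(d) + np.sqrt(s2) * rng.standard_normal(n)

J = X.T @ X / s2 + lam * np.eye(d)        # posterior precision
mu = np.linalg.solve(J, X.T @ y / s2)     # posterior mean

def perturb_and_map(rng):
    # Locally inject noise into each Gaussian factor, then re-solve for the mean/mode
    y_t = y + np.sqrt(s2) * rng.standard_normal(n)   # perturbed likelihood factor
    b_t = rng.standard_normal(d) / np.sqrt(lam)      # perturbed prior factor
    return np.linalg.solve(J, X.T @ y_t / s2 + lam * b_t)

samples = np.array([perturb_and_map(rng) for _ in range(20000)])
```

A short covariance calculation shows why this is exact: the sample is J^{-1} times a Gaussian with covariance J, so its covariance is J^{-1} J J^{-1} = J^{-1}, and averaging squared samples gives the unbiased marginal-variance estimator the snippet mentions.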

glm-ie: Generalised Linear Models Inference & Estimation Toolbox

- Computer Science, J. Mach. Learn. Res.
- 2012

This work designed the glm-ie package to be simple, generic and easily expansible, and provides a wide choice of penalty functions for estimation, potential functions for inference and matrix classes with lazy evaluation for convenient modelling.

Causal and Probabilistic Inference Empirical and Theoretical Analysis of Bayesian Inference in Gaussian Process Models

- Computer Science
- 2009

The Gaussian process (GP) predictor is a popular kernel method that allows uncertainty about smooth functions to be expressed [5]. In Bayesian inference, a GP prior is combined with the data to yield the…

Regularization Strategies and Empirical Bayesian Learning for MKL

- Computer Science, arXiv
- 2010

This paper shows how different MKL algorithms can be understood as applications of either regularization on the kernel weights or block-norm-based regularization, which is more common in structured sparsity and multi-task learning.

## References

Showing 1–10 of 78 references.

Sparse linear models: Variational approximate inference and Bayesian experimental design

- Computer Science
- 2009

It is argued that many problems in practice, such as compressive sensing for real-world image reconstruction, are served much better by proper uncertainty approximations than by ever more aggressive sparse estimation algorithms.

Convex variational Bayesian inference for large scale generalized linear models

- Computer Science, ICML '09
- 2009

We show how variational Bayesian inference can be implemented for very large generalized linear models. Our relaxation is proven to be a convex problem for any log-concave model. We provide a generic…

Latent Variable Bayesian Models for Promoting Sparsity

- Mathematics, IEEE Transactions on Information Theory
- 2011

In coefficient space, the analysis reveals that Type II is exactly equivalent to performing standard MAP estimation using a particular class of dictionary- and noise-dependent, nonfactorial coefficient priors.

Gaussian Covariance and Scalable Variational Inference

- Computer Science, ICML
- 2010

This work provides theoretical and empirical insights into algorithmic and statistical consequences of low-rank covariance approximation errors on decision outcomes in nonlinear sequential Bayesian experimental design.

Graphical Models, Exponential Families, and Variational Inference

- Computer Science, Found. Trends Mach. Learn.
- 2008

The variational approach provides a complementary alternative to Markov chain Monte Carlo as a general source of approximation methods for inference in large-scale statistical models.

Variational methods for inference and estimation in graphical models

- Computer Science
- 1997

This thesis proposes a principled framework for approximating graphical models based on variational methods, developing variational techniques from a perspective that unifies and expands their applicability to graphical models.

Sparse reconstruction by separable approximation

- Computer Science, Mathematics, IEEE Trans. Signal Process.
- 2009

This work proposes iterative methods in which each step is obtained by solving an optimization subproblem involving a quadratic term with diagonal Hessian plus the original sparsity-inducing regularizer, and proves convergence of the proposed iterative algorithm to a minimum of the objective function.
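For an l1 regularizer, the diagonal-Hessian quadratic subproblem described in that snippet separates across coordinates and is solved in closed form by soft-thresholding. The sketch below uses a fixed step 1/L (the cited method chooses steps adaptively; this fixed-step simplification is the classical ISTA special case) on illustrative synthetic data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sparse regression problem: 3-sparse signal, 40 observations, 100 features
n, d = 40, 100
X = rng.standard_normal((n, d))
w_true = np.zeros(d)
w_true[:3] = [2.0, -1.5, 1.0]
y = X @ w_true + 0.01 * rng.standard_normal(n)

tau = 1.0                                   # l1 regularization weight
L = np.linalg.norm(X, 2) ** 2               # Lipschitz constant of the smooth gradient

def soft(u, t):
    # Closed-form solution of the separable subproblem for the l1 penalty
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

w = np.zeros(d)
obj = []
for _ in range(500):
    grad = X.T @ (X @ w - y)
    w = soft(w - grad / L, tau / L)         # quadratic term with diagonal Hessian + l1
    obj.append(0.5 * np.sum((y - X @ w) ** 2) + tau * np.sum(np.abs(w)))
```

Because each step majorizes the objective, the iteration decreases it monotonically, which is the convergence property the snippet refers to.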

A Variational Baysian Framework for Graphical Models

- Computer Science, NIPS
- 1999

This paper presents a novel practical framework for Bayesian model averaging and model selection in probabilistic graphical models. Our approach approximates full posterior distributions over model…

Propagation Algorithms for Variational Bayesian Learning

- Computer Science, NIPS
- 2000

It is demonstrated how the belief propagation and the junction tree algorithms can be used in the inference step of variational Bayesian learning to infer the hidden state dimensionality of the state-space model in a variety of synthetic problems and one real high-dimensional data set.

Bayesian Inference for Spiking Neuron Models with a Sparsity Prior

- Biology, Computer Science, NIPS
- 2007

Using the expectation propagation algorithm, the Bayesian treatment of generalized linear models is presented, able to approximate the full posterior distribution over all weights, and the sparsity of the Laplace prior is used to select those filters from a spike-triggered covariance analysis that are most informative about the neural response.