• Corpus ID: 10153069

Bayesian Structured Sparsity from Gaussian Fields

@article{Engelhardt2014BayesianSS,
  title={Bayesian Structured Sparsity from Gaussian Fields},
  author={Barbara E. Engelhardt and Ryan P. Adams},
  journal={arXiv: Methodology},
  year={2014}
}
Substantial research on structured sparsity has contributed to the analysis of many different applications. However, few of these procedures have been Bayesian. Here, we develop a Bayesian model for structured sparsity that uses a Gaussian process (GP) to share parameters of the sparsity-inducing prior in proportion to feature similarity, as defined by an arbitrary positive definite kernel. For linear regression, this sparsity-inducing prior on regression coefficients is a relaxation of…
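The truncated abstract only gestures at the construction; a minimal generative sketch of one plausible reading is given below. The 1-D feature locations, squared-exponential kernel, probit link, and Gaussian slab are illustrative assumptions, not necessarily the paper's choices.

# Minimal sketch: a latent Gaussian field over features couples the
# sparsity parameters of nearby coefficients through a PD kernel.
# All modeling choices here (probit link, SE kernel) are assumptions.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

p = 50                                  # number of features
x = np.linspace(0.0, 1.0, p)            # hypothetical 1-D feature locations

# Squared-exponential kernel over feature similarity.
lengthscale, amp = 0.1, 4.0
K = amp * np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / lengthscale ** 2)
K += 1e-8 * np.eye(p)                   # jitter for numerical stability

# Latent Gaussian field over the features.
f = rng.multivariate_normal(np.zeros(p), K)

# A probit link turns the field into inclusion probabilities, so similar
# features share similar probabilities of having nonzero coefficients.
pi = norm.cdf(f)

# Spike-and-slab draw of the regression coefficients.
z = rng.uniform(size=p) < pi            # inclusion indicators
beta = np.where(z, rng.normal(0.0, 1.0, size=p), 0.0)
print("nonzero coefficients:", np.flatnonzero(z))

Because the latent field is smooth under the kernel, the active coefficients tend to cluster, which is the "structured" part of the sparsity.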


Bayesian group factor analysis with structured sparsity
TLDR
A structured Bayesian group factor analysis model is developed that extends the factor model to multiple coupled observation matrices and allows for both dense and sparse latent factors so that covariation among either all features or only a subset of features can be recovered.
Bayesian group latent factor analysis with structured sparsity
TLDR
A structured Bayesian group factor analysis model is developed that extends the factor model to multiple coupled observation matrices and allows for both dense and sparse latent factors, so that covariation among either all features or only a subset of features can be recovered.
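For intuition, here is a toy generative version of the coupled-matrix factor model that both entries above summarize; the dimensions, noise level, sparsity fraction, and the choice of which factors are sparse are arbitrary illustrative assumptions.

# Toy sketch: two observation matrices share latent factors, some dense
# and some sparse, so covariation can involve all features or a subset.
import numpy as np

rng = np.random.default_rng(6)
n, k = 100, 4                           # samples, latent factors
p1, p2 = 30, 40                         # features in each observation matrix

F = rng.normal(size=(k, n))             # shared latent factors

def loading(p, sparse):
    """One column of loadings; a sparse factor loads on a feature subset."""
    col = rng.normal(size=p)
    if sparse:
        col = col * (rng.uniform(size=p) < 0.2)
    return col

# Factors 0-1 dense, factors 2-3 sparse, independently per matrix.
L1 = np.stack([loading(p1, sparse=(j >= 2)) for j in range(k)], axis=1)
L2 = np.stack([loading(p2, sparse=(j >= 2)) for j in range(k)], axis=1)

Y1 = L1 @ F + 0.1 * rng.normal(size=(p1, n))
Y2 = L2 @ F + 0.1 * rng.normal(size=(p2, n))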
Sparse Bayesian structure learning with dependent relevance determination priors
TLDR
This work introduces a hierarchical model for smooth, region-sparse weight vectors and tensors in a linear regression setting, where a transformed Gaussian process is added to model the dependencies between the prior variances of regression weights.
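That construction admits a compact sketch: a Gaussian process over the log prior variances makes nearby weights large or small together. The exp transform and squared-exponential kernel below are assumed choices for illustration.

# Dependent relevance determination in miniature: GP over log-variances.
import numpy as np

rng = np.random.default_rng(1)
p = 60
idx = np.arange(p, dtype=float)

# GP over log-variances; exp keeps variances positive while preserving
# the GP's smooth dependence structure.
K = 9.0 * np.exp(-0.5 * (idx[:, None] - idx[None, :]) ** 2 / 5.0 ** 2)
g = rng.multivariate_normal(-4.0 * np.ones(p), K + 1e-8 * np.eye(p))
prior_var = np.exp(g)                   # smooth, mostly-near-zero variances

# Weights drawn with these dependent variances form contiguous active regions.
w = rng.normal(0.0, np.sqrt(prior_var))
print("largest |w| indices:", np.argsort(-np.abs(w))[:8])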
Dependent relevance determination for smooth and structured sparse regression
TLDR
This work introduces a hierarchical model for smooth, region-sparse weight vectors and tensors in a linear regression setting, where a transformed Gaussian process is combined with a structured model of the prior variances of Fourier coefficients, which eliminates unnecessary high frequencies.
Bayesian Sparsity for Intractable Distributions
TLDR
Fadeout is introduced, an approach for variational inference that uses noncentered parameterizations to capture a posteriori correlations between parameters and hyperparameters and it is found that this framework substantially improves inferences of undirected graphical models under both sparse and group-sparse priors.
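The noncentered parameterization the summary refers to is easy to show in miniature; the toy scale prior below is an assumption for illustration.

# Centered vs. noncentered parameterization of a scale hierarchy.
import numpy as np

rng = np.random.default_rng(2)
log_s = rng.normal(0.0, 1.0)            # scale hyperparameter (toy prior)
s = np.exp(log_s)

w_centered = rng.normal(0.0, s)         # centered: w depends on s a priori

w_tilde = rng.normal(0.0, 1.0)          # noncentered: standardized draw...
w_noncentered = s * w_tilde             # ...deterministically rescaled

# Both give the same marginal law for w, but (w_tilde, log_s) are a priori
# independent, which is the structure a variational method can exploit.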
Bayesian Model Selection And Estimation Without Mcmc
TLDR
This dissertation explores Bayesian model selection and estimation in settings where the model space is too vast to rely on Markov Chain Monte Carlo for posterior calculation, and proposes an Expectation-Conditional Maximization algorithm to target a single posterior mode.
Bayesian group latent factor analysis with structured sparse priors
TLDR
The unique ability of BGFA to use multiple observations of the same samples to guide the linear projection of the data onto a latent space is illustrated, producing meaningful and robust low-dimensional representations compared with 'unsupervised' projections from traditional factor analysis or principal components analysis.
Spatio-Temporal Structured Sparse Regression With Hierarchical Gaussian Process Priors
This paper introduces a new sparse spatio-temporal structured Gaussian process regression framework for online and offline Bayesian inference. This is the first framework that gives a time-evolving…
Variational Inference for Sparse and Undirected Models
TLDR
A framework for scalable Bayesian inference of discrete undirected models based on two new methods, Persistent VI and Fadeout, which substantially improve learning of sparse undirected graphical models in simulated and real problems from physics and biology.
Bayesian Inference for Spatio-temporal Spike-and-Slab Priors
In this work, we address the problem of solving a series of underdetermined linear inverse problems subject to a sparsity constraint. We generalize the spike-and-slab prior distribution to encode a…
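A toy instance of that setting, with hypothetical sizes and prior parameters:

# Underdetermined linear inverse problem y = A x + noise, with a
# spike-and-slab prior on x. All constants below are illustrative.
import numpy as np

rng = np.random.default_rng(3)
n, p = 20, 100                          # n << p: underdetermined
A = rng.normal(size=(n, p)) / np.sqrt(n)

pi0, slab_sd, noise_sd = 0.05, 2.0, 0.1 # inclusion prob., slab and noise scales
z = rng.uniform(size=p) < pi0           # spike-and-slab support
x = np.where(z, rng.normal(0.0, slab_sd, size=p), 0.0)
y = A @ x + rng.normal(0.0, noise_sd, size=n)

print(f"{z.sum()} active coefficients out of {p}, {n} measurements")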
…

References

SHOWING 1-10 OF 83 REFERENCES
Bayesian shrinkage
TLDR
It is demonstrated that the most commonly used shrinkage priors, including the Bayesian Lasso, are suboptimal in high-dimensional settings, and a new class of Dirichlet-Laplace (DL) priors is proposed that is optimal and leads to efficient posterior computation exploiting results from normalized random measure theory.
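A draw from a DL prior can be sketched from the usual hierarchy, assuming theta_j | phi, tau ~ Laplace(phi_j * tau) with phi ~ Dirichlet(a, ..., a); the hyperparameter values below are illustrative, not prescriptive.

# Sketch of a Dirichlet-Laplace prior draw.
import numpy as np

rng = np.random.default_rng(4)
p, a = 200, 0.5                         # dimension and Dirichlet parameter
phi = rng.dirichlet(np.full(p, a))      # local scales summing to one
tau = rng.gamma(shape=p * a, scale=2.0) # global scale (one common choice)
theta = rng.laplace(loc=0.0, scale=phi * tau)

# Most mass concentrates near zero with a few large entries: the
# shrinkage behaviour suited to high-dimensional settings.
print("top |theta|:", np.sort(np.abs(theta))[-5:])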
Bayesian and L1 Approaches to Sparse Unsupervised Learning
TLDR
The need to reassess the wide use of L1 methods in sparsity-reliant applications is highlighted, particularly when generalisation to previously unseen data matters, and an alternative is provided that, over many varying conditions, delivers improved generalisation performance.
Variational Bayesian Multinomial Probit Regression with Gaussian Process Priors
TLDR
This is the first time that a fully variational Bayesian treatment for multiclass GP classification has been developed without having to resort to additional explicit approximations to the non-Gaussian likelihood term.
Smoothing proximal gradient method for general structured sparse regression
TLDR
This paper proposes a general optimization approach, the smoothing proximal gradient method, which can solve structured sparse regression problems with any smooth convex loss under a wide spectrum of structured sparsity-inducing penalties.
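For intuition, the unstructured l1 special case of that proximal machinery reduces to plain proximal gradient descent (ISTA); the paper's smoothing step is what extends this to structured penalties. Problem sizes below are arbitrary.

# ISTA: proximal gradient for l1-penalized least squares.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, n_iter=500):
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)        # gradient of 0.5 * ||Ax - y||^2
        x = soft_threshold(x - grad / L, lam / L)
    return x

rng = np.random.default_rng(7)
A = rng.normal(size=(50, 120))
x_true = np.zeros(120)
x_true[:5] = 3.0
y = A @ x_true + 0.05 * rng.normal(size=50)
x_hat = ista(A, y, lam=1.0)
print("recovered support:", np.flatnonzero(np.abs(x_hat) > 0.5))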
Shrink Globally, Act Locally: Sparse Bayesian Regularization and Prediction
We study the classic problem of choosing a prior distribution for a location parameter β = (β1, …, βp) as p grows large. First, we study the standard “global-local shrinkage” approach, based on…
Spike and slab variable selection: Frequentist and Bayesian strategies
TLDR
This paper introduces a variable selection method referred to as a rescaled spike and slab model, and studies the usefulness of continuous bimodal priors to model hypervariance parameters, and the effect scaling has on the posterior mean through its relationship to penalization.
Bayesian Models for Sparse Regression Analysis of High Dimensional Data
This paper considers the task of building efficient regression models for sparse multivariate analysis of high-dimensional data sets; in particular, it focuses on cases where the numbers q of…
Generalized Beta Mixtures of Gaussians
TLDR
A new class of normal scale mixtures is proposed through a novel generalized beta distribution that encompasses many interesting priors as special cases, and a class of variational Bayes approximations is developed that will scale more efficiently to the types of truly massive data sets that are now encountered routinely.
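One well-known member of this normal scale-mixture family is the horseshoe, which the generalized-beta construction nests as a special case; a draw looks like this (tau fixed for illustration):

# Horseshoe prior: beta_j ~ N(0, lambda_j^2 * tau^2), lambda_j ~ C+(0, 1).
import numpy as np

rng = np.random.default_rng(5)
p, tau = 500, 0.1
lam = np.abs(rng.standard_cauchy(size=p))   # half-Cauchy local scales
beta = rng.normal(0.0, lam * tau)

# Heavy tails plus a pole at zero: aggressive shrinkage of noise with
# little shrinkage of genuine signals.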
Generalized double Pareto shrinkage
TLDR
As sparse estimation plays an important role in many problems, the properties of the maximum a posteriori estimator are investigated, connections with some well-established regularization procedures are revealed, and some asymptotic results are shown.
Proximal Methods for Hierarchical Sparse Coding
TLDR
The procedure has a complexity linear, or close to linear, in the number of atoms, and allows the use of accelerated gradient techniques to solve the tree-structured sparse approximation problem at the same computational cost as traditional ones using the l1-norm.
…