• Corpus ID: 237372175

Half-Space and Box Constraints as NUV Priors: First Results

Raphael Keusch, Hans-Andrea Loeliger
Normals with unknown variance (NUV) can represent many useful priors and blend well with Gaussian models and message passing algorithms. NUV representations of sparsifying priors have long been known, and NUV representations of binary (and M-level) priors have been proposed very recently. In this document, we propose NUV representations of half-space constraints and box constraints, which allow adding such constraints to any linear Gaussian model with any of the previously known NUV priors… 
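To make the NUV idea concrete, the following is a minimal sketch of the classical NUV-EM scheme for sparse least squares mentioned in the abstract (not the half-space or box construction of this paper). Each coefficient gets a zero-mean Gaussian prior with unknown variance, and EM alternates a Gaussian estimate with a closed-form variance update; the function name, parameters, and demo data are illustrative assumptions.

```python
import numpy as np

def nuv_em_sparse(A, y, noise_var=1e-2, n_iter=100, eps=1e-9):
    """Illustrative NUV-EM sketch for sparse least squares.

    Model: y = A x + noise, with prior x_k ~ N(0, s_k) and unknown
    variances s_k. EM alternates:
      E-step: Gaussian posterior of x given the current variances s,
      M-step: variance update s_k = m_k^2 + P_kk (standard SBL/NUV-EM).
    Irrelevant coefficients have s_k driven toward zero, which
    sparsifies the estimate.
    """
    n = A.shape[1]
    s = np.ones(n)  # unknown prior variances, initialized broad
    for _ in range(n_iter):
        # E-step: posterior covariance P and mean m of x
        P = np.linalg.inv(A.T @ A / noise_var + np.diag(1.0 / np.maximum(s, eps)))
        m = P @ A.T @ y / noise_var
        # M-step: closed-form EM update of the unknown variances
        s = m**2 + np.diag(P)
    return m

# small demo: recover a 2-sparse vector from noisy measurements
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[[2, 7]] = [1.5, -2.0]
y = A @ x_true + 0.05 * rng.standard_normal(30)
x_hat = nuv_em_sparse(A, y)
```

Because each step is a plain Gaussian estimation problem, the E-step can equivalently be carried out by Gaussian message passing (e.g., Kalman smoothing) in a factor graph, which is what makes NUV priors compose so well with linear Gaussian models.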


Factor Graphs with NUV Priors and Iteratively Reweighted Descent for Sparse Least Squares and More
Smoothed-NUV representations of the Huber function and of a related nonconvex cost function are proposed, and their use for sparse least squares with outliers and in a natural (piecewise smooth) prior for imaging is illustrated.
On sparsity by NUV-EM, Gaussian message passing, and Kalman smoothing
Improved tables of Gaussian-message computations are given, from which Gaussian message passing algorithms closely related to Kalman smoothing are derived, and two preferred such algorithms are pointed out.
Optimization with Sparsity-Inducing Penalties
This monograph covers proximal methods, block-coordinate descent, reweighted l2-penalized techniques, working-set and homotopy methods, as well as non-convex formulations and extensions, and provides an extensive set of experiments to compare various algorithms from a computational point of view.
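As a concrete instance of the reweighted-ℓ2 techniques surveyed in that monograph, here is a minimal IRLS sketch for the lasso objective 0.5‖Ax − y‖² + λ‖x‖₁; the function name, parameters, and demo data are illustrative assumptions, not code from the monograph.

```python
import numpy as np

def irls_lasso(A, y, lam=0.5, n_iter=50, eps=1e-8):
    """Reweighted-l2 (IRLS) sketch for 0.5*||Ax - y||^2 + lam*||x||_1.

    Each iteration replaces the l1 penalty lam*|x_k| by its quadratic
    majorizer lam * x_k^2 / (2*|x_k_old|), so the update is a weighted
    ridge regression with a closed-form solution.
    """
    x = np.linalg.lstsq(A, y, rcond=None)[0]  # least-squares init
    G = A.T @ A
    b = A.T @ y
    for _ in range(n_iter):
        # weights lam/|x_k| from the current iterate (eps avoids division by zero)
        W = np.diag(lam / (np.abs(x) + eps))
        x = np.linalg.solve(G + W, b)
    return x

# small demo: a 2-sparse vector with mild noise
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 12))
x_true = np.zeros(12)
x_true[[0, 5]] = [2.0, -1.0]
y = A @ x_true + 0.02 * rng.standard_normal(40)
x_hat = irls_lasso(A, y, lam=0.5)
```

The reweighted quadratic has the same form as a Gaussian prior with a coefficient-dependent variance, which is the bridge between such IRLS schemes and the NUV viewpoint of the main paper.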
Sparse Bayesian learning for basis selection
  • D. Wipf, B. Rao
  • Computer Science, Mathematics
    IEEE Transactions on Signal Processing
  • 2004
This paper adapts SBL to the signal processing problem of basis selection from overcomplete dictionaries, proving several results about the SBL cost function that elucidate its general behavior and providing solid theoretical justification for this application.
Fast Marginal Likelihood Maximisation for Sparse Bayesian Models
This work describes a new and highly accelerated algorithm which exploits recently-elucidated properties of the marginal likelihood function to enable maximisation via a principled and efficient sequential addition and deletion of candidate basis functions.
Bayesian Interpolation
  • D. Mackay
  • Computer Science
    Neural Computation
  • 1992
The Bayesian approach to regularization and model comparison is demonstrated on the inference problem of interpolating noisy data, by examining the posterior probability distribution of regularizing constants and noise levels.
A New View of Automatic Relevance Determination
This paper furnishes an alternative means of expressing the ARD cost function using auxiliary functions, which naturally addresses these issues and suggests alternative cost functions and update procedures for selecting features and promoting sparse solutions in a variety of general situations.
Binary Control and Digital-to-Analog Conversion Using Composite NUV Priors and Iterative Gaussian Message Passing
A new method is proposed to determine a binary control signal for an analog linear system such that the state, or some output, of the system follows a given target trajectory; the method can also be used for digital-to-analog conversion.
An introduction to factor graphs
  • H. Loeliger
  • Computer Science
    IEEE Signal Processing Magazine
  • 2004
This work uses Forney-style factor graphs, which support hierarchical modeling and are compatible with standard block diagrams, and uses them to derive practical detection/estimation algorithms in a wide range of applications.
Lp normed minimization with applications to linear predictive modeling for sinusoidal frequency estimation
The robustness of an Lp-normed solution when estimating the frequencies of sinusoids from data contaminated by impulsive noise is demonstrated, and insight is gained into the transient and steady-state behavior of the iterative algorithm.