Corpus ID: 124722254

Semidefinite Optimization with Applications in Sparse Multivariate Statistics

@inproceedings{dAspremont2007SemidefiniteOW,
  title={Semidefinite Optimization with Applications in Sparse Multivariate Statistics},
  author={Alexandre d'Aspremont},
  year={2007}
}

Topics from this paper

Nonnegative matrix factorization: complexity, algorithms and applications
Linear dimensionality reduction techniques such as principal component analysis are powerful tools for the analysis of high-dimensional data. In this thesis, we explore a closely related problem …
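As an illustration of the factorization problem described in that entry, here is a minimal sketch using scikit-learn's NMF on synthetic nonnegative data; the library choice, rank, and initialization are assumptions made for the example, not details taken from the cited thesis.

    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(0)
    X = rng.random((100, 20))                # nonnegative data matrix

    # Factor X ~= W H with nonnegative factors of rank 5.
    model = NMF(n_components=5, init="nndsvda", max_iter=500, random_state=0)
    W = model.fit_transform(X)               # 100 x 5
    H = model.components_                    # 5 x 20

    print("reconstruction error:", np.linalg.norm(X - W @ H, "fro"))
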
Wideband waveform optimization for energy detector receiver with practical considerations
This paper deals with waveform optimization problems arising from advanced radio system prototyping conducted recently. Motivated by increasing demands for wireless sensor networks, simple receivers …

References

Showing 1-10 of 31 references
Prox-method with rate of convergence O(1/t) for variational inequalities with Lipschitz continuous monotone operators and smooth convex-concave saddle point problems
  • A. Nemirovski
  • SIAM Journal on Optimization
  • 2004
We propose a prox-type method with efficiency estimate O(1/ε) for approximating saddle points of convex-concave C^{1,1} functions and solutions of variational inequalities with monotone Lipschitz continuous operators.
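In the Euclidean setting the prox-method reduces to the extragradient scheme. The sketch below applies it to a synthetic bilinear saddle point min_x max_y c'x + y'Ax - b'y over box constraints; the problem data, step size, and iteration count are assumptions chosen for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    n, m = 20, 15
    A = rng.standard_normal((m, n))
    b = rng.standard_normal(m)
    c = rng.standard_normal(n)

    proj = lambda v: np.clip(v, -1.0, 1.0)      # projection onto the box [-1, 1]
    F = lambda x, y: (c + A.T @ y, b - A @ x)   # monotone operator of the saddle point

    gamma = 0.9 / np.linalg.norm(A, 2)          # step size below 1/Lipschitz constant
    x, y = np.zeros(n), np.zeros(m)
    for _ in range(2000):
        gx, gy = F(x, y)
        xh, yh = proj(x - gamma * gx), proj(y - gamma * gy)   # extrapolation step
        gx, gy = F(xh, yh)
        x, y = proj(x - gamma * gx), proj(y - gamma * gy)     # update step

    gx, gy = F(x, y)
    res = np.linalg.norm(np.concatenate([x - proj(x - gamma * gx), y - proj(y - gamma * gy)]))
    print("projection residual:", res)
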
Covariance selection
  • A. P. Dempster
  • Biometrics
  • 1972
Maximum likelihood estimation of Gaussian graphical models: Numerical implementation and topology selection
We describe algorithms for maximum likelihood estimation of Gaussian graphical models with conditional independence constraints. It is well-known that this problem can be formulated as an …
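A hedged sketch of that formulation as a convex program: maximize log det K - tr(SK) with zero constraints on the precision matrix for assumed non-edges. It uses cvxpy and synthetic data, which are choices made for this example rather than the implementation described in the reference.

    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(0)
    p = 5
    X = rng.standard_normal((200, p))
    S = np.cov(X, rowvar=False)                  # sample covariance

    non_edges = [(0, 3), (1, 4)]                 # assumed conditional independencies
    K = cp.Variable((p, p), symmetric=True)      # precision (inverse covariance) matrix
    constraints = [K >> 1e-6 * np.eye(p)]        # keep K positive definite
    constraints += [K[i, j] == 0 for i, j in non_edges]

    # Maximum likelihood objective for a zero-mean Gaussian model.
    problem = cp.Problem(cp.Maximize(cp.log_det(K) - cp.trace(S @ K)), constraints)
    problem.solve()
    print("estimated precision matrix:\n", np.round(K.value, 3))
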
Sparse Principal Component Analysis
  • H. Zou, T. Hastie, R. Tibshirani
  • Journal of Computational and Graphical Statistics
  • 2006
Principal component analysis (PCA) is widely used in data processing and dimensionality reduction. However, PCA suffers from the fact that each principal component is a linear combination of all the original variables, thus it is often difficult to interpret the results.
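Since the thesis studies semidefinite relaxations of sparse PCA, a small cvxpy sketch of one such relaxation may help: maximize tr(SX) - ρ·Σ|X_ij| subject to tr(X) = 1 and X positive semidefinite, then read an approximately sparse loading off the leading eigenvector of X. The solver, penalty weight, and data below are assumptions for illustration.

    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(1)
    p = 8
    B = rng.standard_normal((100, p))
    S = B.T @ B / 100.0                          # sample covariance
    rho = 0.5                                    # sparsity-inducing penalty weight

    X = cp.Variable((p, p), symmetric=True)
    objective = cp.Maximize(cp.trace(S @ X) - rho * cp.sum(cp.abs(X)))
    cp.Problem(objective, [X >> 0, cp.trace(X) == 1]).solve()

    # The leading eigenvector of the optimal X gives an approximately sparse loading.
    w, V = np.linalg.eigh(X.value)
    print("approximate sparse loading:", np.round(V[:, -1], 3))
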
Decoding by linear programming
  • E. Candès, T. Tao
  • Computer Science, Mathematics
    IEEE Transactions on Information Theory
  • 2005
It is shown that f can be recovered exactly by solving a simple convex optimization problem (which one can recast as a linear program); numerical experiments suggest that this recovery procedure works unreasonably well, and f is recovered exactly even in situations where a significant fraction of the output is corrupted.
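A minimal sketch of the recovery procedure described above: with y = Af + e and a sparsely corrupted e, recover f by minimizing ||y - Ag||_1, recast as a linear program. The dimensions, corruption level, and use of scipy's linprog are assumptions for the example.

    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(0)
    m, n = 120, 40
    A = rng.standard_normal((m, n))
    f = rng.standard_normal(n)
    e = np.zeros(m)
    e[rng.choice(m, size=12, replace=False)] = 10.0 * rng.standard_normal(12)
    y = A @ f + e                                 # corrupted observations

    # Variables z = (g, t); minimize sum(t) subject to -t <= y - A g <= t.
    c = np.concatenate([np.zeros(n), np.ones(m)])
    A_ub = np.block([[A, -np.eye(m)], [-A, -np.eye(m)]])
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * n + [(0, None)] * m
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)

    g = res.x[:n]
    print("recovery error:", np.linalg.norm(g - f))
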
Smooth Optimization for Sparse Semidefinite Programs
Smooth minimization of non-smooth functions
  • Y. Nesterov
  • Mathematics, Computer Science
    Math. Program.
  • 2005
A new approach for constructing efficient schemes for non-smooth convex optimization is proposed, based on a special smoothing technique that can be applied to functions with explicit max-structure and can be considered an alternative to black-box minimization.
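A hedged sketch of the smoothing idea on one max-structured example: ||Ax - b||_1 = max over ||u||_∞ ≤ 1 of u'(Ax - b) is replaced by the smooth approximation f_μ(x) = max_u { u'(Ax - b) - (μ/2)||u||² }, which is then minimized with an accelerated gradient scheme. The data, smoothing parameter, and iteration count are assumptions for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    m, n = 60, 30
    A = rng.standard_normal((m, n))
    b = rng.standard_normal(m)

    mu = 1e-2                                     # smoothing parameter
    L = np.linalg.norm(A, 2) ** 2 / mu            # Lipschitz constant of grad f_mu

    def grad_smooth(x):
        u = np.clip((A @ x - b) / mu, -1.0, 1.0)  # maximizer in the smoothed max
        return A.T @ u

    x, x_prev, t = np.zeros(n), np.zeros(n), 1.0
    for _ in range(500):                          # accelerated gradient iterations
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x + (t - 1.0) / t_next * (x - x_prev)
        x_prev, x, t = x, y - grad_smooth(y) / L, t_next

    print("objective ||Ax - b||_1:", np.abs(A @ x - b).sum())
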
Sparse nonnegative solution of underdetermined linear equations by linear programming.
  • D. Donoho, J. Tanner
  • Mathematics, Medicine
    Proceedings of the National Academy of Sciences of the United States of America
  • 2005
It is shown that outward k-neighborliness is equivalent to the statement that, whenever y = Ax has a nonnegative solution with at most k nonzeros, it is the nonnegative solution to y = Ax having minimal sum.
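A minimal sketch of that recovery principle: given an underdetermined system y = Ax with a sparse nonnegative solution, solve the linear program minimizing the sum of the entries subject to Ax = y, x ≥ 0. Dimensions, sparsity level, and the use of scipy's linprog are assumptions for the example.

    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(0)
    m, n, k = 40, 100, 8
    A = rng.standard_normal((m, n))
    x_true = np.zeros(n)
    x_true[rng.choice(n, size=k, replace=False)] = rng.random(k) + 0.5
    y = A @ x_true

    # Minimize sum(x) subject to Ax = y and x >= 0.
    res = linprog(c=np.ones(n), A_eq=A, b_eq=y, bounds=[(0, None)] * n)
    print("recovery error:", np.linalg.norm(res.x - x_true))
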
Bayesian Covariance Selection
We present a novel structural learning method called HdBCS that performs covariance selection in a Bayesian framework for datasets with tens of thousands of variables. HdBCS is based on the intrinsic …