A unified framework for high-dimensional analysis of $M$-estimators with decomposable regularizers

@inproceedings{Negahban2009AUF,
  title={A unified framework for high-dimensional analysis of \$M\$-estimators with decomposable regularizers},
  author={Sahand N. Negahban and Pradeep Ravikumar and Martin J. Wainwright and Bin Yu},
  booktitle={NIPS},
  year={2009}
}
High-dimensional statistical inference deals with models in which the number of parameters p is comparable to or larger than the sample size n. Since it is usually impossible to obtain consistent procedures unless p/n → 0, a line of recent work has studied models with various types of structure (e.g., sparse vectors, block-structured matrices, low-rank matrices, Markov assumptions). In such settings, a general approach to estimation is to solve a regularized convex program (known as a …
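For concreteness, a minimal sketch (not from the paper) of one instance of such a regularized convex program: the Lasso, which combines a least-squares loss with the decomposable $\ell_1$ regularizer. The solver below uses plain proximal gradient descent (ISTA); the function names and step-size choice are illustrative assumptions, not the paper's method.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: shrinks each coordinate toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(X, y, lam, iters=500):
    """Minimize (1/2n)||y - X theta||^2 + lam * ||theta||_1 via ISTA.

    A sketch of a regularized M-estimator: smooth loss plus a
    decomposable (here l1) penalty, solved by proximal gradient steps.
    """
    n, p = X.shape
    # Step size 1/L, where L = ||X||_2^2 / n bounds the gradient's Lipschitz constant.
    step = n / np.linalg.norm(X, 2) ** 2
    theta = np.zeros(p)
    for _ in range(iters):
        grad = X.T @ (X @ theta - y) / n      # gradient of the smooth loss
        theta = soft_threshold(theta - step * grad, step * lam)
    return theta
```

On a well-conditioned random design with a sparse ground truth, this recovers the support and comes within the usual $\lambda$-induced bias of the true coefficients, matching the qualitative behavior the framework analyzes.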