I-LAMM FOR SPARSE LEARNING: SIMULTANEOUS CONTROL OF ALGORITHMIC COMPLEXITY AND STATISTICAL ERROR.

@article{Fan2018ILAMMFS,
  title={I-LAMM for Sparse Learning: Simultaneous Control of Algorithmic Complexity and Statistical Error},
  author={Jianqing Fan and Han Liu and Q. Sun and T. Zhang},
  journal={Annals of Statistics},
  year={2018},
  volume={46},
  number={2},
  pages={814--841}
}
We propose a computational framework named iterative local adaptive majorize-minimization (I-LAMM) to simultaneously control algorithmic complexity and statistical error when fitting high-dimensional models. I-LAMM is a two-stage algorithmic implementation of the local linear approximation to a family of folded-concave penalized quasi-likelihoods. The first stage solves a convex program with a crude precision tolerance to obtain a coarse initial estimator, which is further refined in the second stage…
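The two-stage scheme described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: it assumes a least-squares loss with a weighted l1 subproblem solved by local adaptive majorize-minimization (the curvature parameter `phi` is inflated until the quadratic surrogate truly majorizes the loss), a crude-tolerance Lasso contraction stage, and SCAD-derivative weights for the local linear approximation in the tightening stage. All function names, tolerances, and the choice of SCAD are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding operator."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def scad_weight(t, lam, a=3.7):
    """SCAD penalty derivative, used as the local linear approximation
    weight (one possible folded-concave penalty; an assumption here)."""
    return lam * (t <= lam) + np.maximum(a * lam - t, 0.0) / (a - 1.0) * (t > lam)

def lamm(X, y, weights, beta0, tol, max_iter=500, phi0=1.0, gamma=2.0):
    """Local adaptive majorize-minimization for one weighted-l1 subproblem:
    repeatedly minimize an isotropic quadratic surrogate of the loss,
    doubling the curvature phi until the surrogate majorizes the loss."""
    n = len(y)
    beta = beta0.copy()
    loss = lambda b: 0.5 * np.sum((y - X @ b) ** 2) / n
    phi = phi0
    for _ in range(max_iter):
        g = -X.T @ (y - X @ beta) / n          # gradient of the loss
        while True:                            # local adaptive majorization check
            beta_new = soft_threshold(beta - g / phi, weights / phi)
            d = beta_new - beta
            if loss(beta_new) <= loss(beta) + g @ d + 0.5 * phi * d @ d + 1e-12:
                break
            phi *= gamma
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

def ilamm(X, y, lam, tol_crude=1e-2, tol_fine=1e-5, n_tighten=3):
    """Two-stage I-LAMM sketch: a crude-tolerance Lasso contraction stage,
    then tightening stages that re-weight via the SCAD linear approximation."""
    p = X.shape[1]
    beta = lamm(X, y, lam * np.ones(p), np.zeros(p), tol_crude)   # stage 1
    for _ in range(n_tighten):                                    # stage 2
        beta = lamm(X, y, scad_weight(np.abs(beta), lam), beta, tol_fine)
    return beta
```

On synthetic sparse regression data, e.g. `ilamm(X, y, lam=0.1)` with 200 observations, 50 features, and 5 nonzero coefficients, the second stage drives the penalty weight on large coefficients to zero, reducing the shrinkage bias of the stage-one Lasso estimate.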
