Corpus ID: 244714651

A dual semismooth Newton based augmented Lagrangian method for large-scale linearly constrained sparse group square-root Lasso problems

@article{Wang2021ADS,
  title={A dual semismooth Newton based augmented Lagrangian method for large-scale linearly constrained sparse group square-root Lasso problems},
  author={Chengjing Wang and Peipei Tang},
  journal={ArXiv},
  year={2021},
  volume={abs/2111.13878}
}
Square-root Lasso problems are known to be robust regression problems. Furthermore, square-root regression problems with structured sparsity also play an important role in statistics and machine learning. In this paper, we focus on the numerical computation of large-scale linearly constrained sparse group square-root Lasso problems. To overcome the difficulty that there are two nonsmooth terms in the objective function, we propose a dual semismooth Newton (SSN) based augmented Lagrangian… 
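The abstract does not display the model itself; a plausible formulation consistent with it is sketched below, where the design matrix A, response b, penalty weights \lambda_1, \lambda_2, w_g, group partition \mathcal{G}, and constraint data (B, c) are assumed notation rather than the paper's own.

\min_{x \in \mathbb{R}^p} \; \|Ax - b\|_2 + \lambda_1 \|x\|_1 + \lambda_2 \sum_{g \in \mathcal{G}} w_g \|x_g\|_2 \quad \text{subject to} \quad Bx = c

The unsquared loss \|Ax - b\|_2 and the \ell_1 plus weighted group-\ell_2 penalty supply the two nonsmooth terms the abstract refers to, and Bx = c encodes the linear constraints.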


References

Showing 1–10 of 43 references
A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems
TLDR
Under very mild conditions, which hold automatically for Lasso problems, both the primal and the dual iteration sequences generated by SSNAL possess a fast linear convergence rate, which can even be asymptotically superlinear.
An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems
TLDR
An efficient augmented Lagrangian method is proposed for large-scale non-overlapping sparse group Lasso problems, with each subproblem solved by a superlinearly convergent inexact semismooth Newton method.
On Regularized Square-root Regression Problems: Distributionally Robust Interpretation and Fast Computations
TLDR
A unified proof is given to show that any square-root regularized model whose penalty function is the sum of a simple norm and a seminorm can be interpreted as the distributionally robust optimization (DRO) formulation of the corresponding least-squares problem.
Sharp Oracle Inequalities for Square Root Regularization
TLDR
A set of regularization methods is studied for high-dimensional linear regression models that have the square root of the residual sum of squared errors as the loss function and any weakly decomposable norm as the penalty function.
Square-Root Lasso: Pivotal Recovery of Sparse Signals via Conic Programming
TLDR
The square-root LASSO is formulated as a solution to a convex conic programming problem, which allows us to use efficient computational methods, such as interior point methods, to implement the estimator.
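To illustrate the conic reformulation this summary describes, here is a minimal sketch that solves an unconstrained square-root Lasso with the cvxpy modeling package; the synthetic data and the weight lam are placeholders, and cvxpy compiles the unsquared-norm objective into a second-order cone program for a conic solver.

import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 100
A = rng.standard_normal((n, p))  # synthetic design matrix
b = rng.standard_normal(n)       # synthetic response
lam = 0.5                        # placeholder regularization weight

x = cp.Variable(p)
# The loss is the unsquared 2-norm, which makes the estimator pivotal
# with respect to the unknown noise level.
prob = cp.Problem(cp.Minimize(cp.norm(A @ x - b, 2) + lam * cp.norm(x, 1)))
prob.solve()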
On the Linear Convergence of a Proximal Gradient Method for a Class of Nonsmooth Convex Minimization Problems
TLDR
A local error bound is proved around the optimal solution set for this problem and used to establish the linear convergence of the proximal gradient method (PGM) without assuming strong convexity of the overall objective function.
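As a concrete instance of the PGM on one member of this problem class, below is a minimal sketch for the plain Lasso, min 0.5*||Ax - b||^2 + lam*||x||_1; the function names and the fixed step size 1/L are illustrative choices, not the cited paper's setup.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (elementwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def pgm_lasso(A, b, lam, n_iter=500):
    # Proximal gradient method for min 0.5*||Ax - b||^2 + lam*||x||_1.
    L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)    # gradient of the smooth least-squares part
        x = soft_threshold(x - grad / L, lam / L)
    return x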
The Group Square-Root Lasso: Theoretical Properties and Fast Algorithms
TLDR
The group square-root lasso (GSRL) method for estimation in high-dimensional sparse regression models with group structure is introduced, and it is shown that the GSRL estimator adapts to the unknown sparsity of the regression vector and has the same optimal estimation and prediction accuracy as the group lasso (GL) estimator, under the same minimal conditions on the model.
Generalized Hessian matrix and second-order optimality conditions for problems with C^{1,1} data
TLDR
This paper presents a generalization of the Hessian matrix to C^{1,1} functions, i.e., to functions whose gradient mapping is locally Lipschitz, and derives a second-order Taylor expansion of a C^{1,1} function.
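A standard one-dimensional example (not from the cited paper) makes the C^{1,1} notion concrete:

f(x) = \tfrac{1}{2}\max(x, 0)^2, \qquad \nabla f(x) = \max(x, 0), \qquad \partial^2 f(0) = \operatorname{conv}\{0, 1\} = [0, 1].

The gradient is globally Lipschitz but not differentiable at the origin, so f is C^{1,1} without being C^2; the generalized Hessian at 0 collects the limits of nearby Hessian values, here {0, 1}, together with their convex hull.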
A note on the group lasso and a sparse group lasso
TLDR
An efficient coordinate descent algorithm is derived for the resulting convex problem; it can be used to solve the general form of the group lasso, with non-orthonormal model matrices.
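For flavor, a minimal cyclic coordinate descent sketch for the plain Lasso (not the sparse group variant of this note); the names, the sweep count, and the assumption that every column of A is nonzero are illustrative.

import numpy as np

def cd_lasso(A, b, lam, n_sweeps=100):
    # Cyclic coordinate descent for min 0.5*||Ax - b||^2 + lam*||x||_1.
    n, p = A.shape
    x = np.zeros(p)
    r = b - A @ x                    # running residual
    col_sq = (A ** 2).sum(axis=0)    # per-coordinate curvatures (assumed nonzero)
    for _ in range(n_sweeps):
        for j in range(p):
            r += A[:, j] * x[j]      # remove coordinate j's contribution
            rho = A[:, j] @ r        # partial correlation with the residual
            x[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r -= A[:, j] * x[j]      # restore the updated contribution
    return x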
Penalized and Constrained Regression
TLDR
This work develops the Penalized and Constrained regression method (PAC), an extremely general method for computing the penalized coefficient paths on high-dimensional GLM fits, subject to a set of linear constraints.