Corpus ID: 239998177

Constrained Optimization Involving Nonconvex 𝓁p Norms: Optimality Conditions, Algorithm and Convergence

Hao Wang, Yining Gao, Jiashan Wang, Hongying Liu
This paper investigates optimality conditions characterizing the local minimizers of constrained optimization problems involving an ℓp norm (0 < p < 1) of the variables, which may appear in either the objective or the constraints. Such problems have broad applicability, since the ℓp norm promotes sparse solutions. However, the nonsmooth and non-Lipschitz nature of the ℓp norm makes these problems difficult to analyze and solve. We…
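To make concrete why the ℓp quasi-norm promotes sparsity, here is a minimal numerical illustration (the function name is my own, not from the paper):

```python
import numpy as np

def lp_norm_p(x, p):
    # ℓp quasi-norm raised to the p-th power: sum_i |x_i|^p with 0 < p < 1.
    # Nonconvex and non-Lipschitz at 0, unlike the convex ℓ1 norm.
    return float(np.sum(np.abs(x) ** p))

# Two vectors with the same ℓ1 norm: the sparser one scores strictly lower,
# so minimizing an ℓp term favors concentrating mass on few entries.
sparse = np.array([1.0, 0.0, 0.0])
spread = np.array([0.5, 0.5, 0.0])
print(lp_norm_p(sparse, 0.5))  # 1.0
print(lp_norm_p(spread, 0.5))  # ≈ 1.414
```

For p → 0 this quantity approaches the number of nonzero entries, which is why the ℓp penalty is often used as a surrogate for the ℓ0 count.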
1 Citation


Towards an efficient approach for the nonconvex $\ell_p$-ball projection: algorithm and analysis
A novel numerical approach is developed for computing a stationary point by solving a sequence of projections onto reweighted ℓ1-balls; the method is simple to implement and computationally efficient.
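The ℓ1-ball projection that such a reweighted scheme repeatedly calls has a well-known sort-based closed form. A sketch of that standard building block (this is the classical Euclidean projection, not code from the cited paper):

```python
import numpy as np

def project_l1_ball(v, radius=1.0):
    # Euclidean projection of v onto {x : ||x||_1 <= radius}.
    if np.sum(np.abs(v)) <= radius:
        return v.copy()
    # Find the soft-threshold level theta via the sorted magnitudes.
    u = np.sort(np.abs(v))[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - radius))[0][-1]
    theta = (css[rho] - radius) / (rho + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

print(project_l1_ball(np.array([3.0, 1.0]), 2.0))  # [2. 0.]
```

A reweighted variant scales the coordinates by the current weights before projecting; the sketch above shows only the unweighted core step.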


Optimality condition and complexity analysis for linearly-constrained optimization without differentiability on the boundary
This paper presents a new set of first- and second-order necessary conditions that are derived without the use of subdifferentials and reduce exactly to the KKT conditions when (twice-)differentiability holds, and it shows that the best known existing complexity bounds actually hold for a wider class of nonlinear programming problems.
Optimality and Complexity for Constrained Optimization Problems with Nonconvex Regularization
This paper considers a class of constrained optimization problems in which the feasible set is a general closed convex set and the objective function has a nonsmooth, nonconvex regularizer, a class that includes the widely used SCAD, MCP, logistic, fraction, hard-thresholding, and non-Lipschitz ℓp penalties as special cases.
Nonconvex and Nonsmooth Sparse Optimization via Adaptively Iterative Reweighted Methods
This work designs a general algorithmic framework of adaptively iteratively reweighted algorithms for nonconvex, nonsmooth sparse optimization, which solves a sequence of weighted convex penalty subproblems with adaptively updated weights.
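A minimal sketch of one such iteratively reweighted ℓ1 scheme, applied to a denoising model where each weighted convex subproblem has a closed-form soft-thresholding solution (parameter names and the smoothing constant eps are my own illustrative choices, not the paper's):

```python
import numpy as np

def irl1_lp_denoise(b, lam=0.5, p=0.5, eps=1e-3, iters=50):
    # Sketch: approximately minimize 0.5*||x - b||^2 + lam * sum_i (|x_i| + eps)^p
    # by solving a sequence of weighted-l1 subproblems
    #   min_x 0.5*||x - b||^2 + lam * sum_i w_i * |x_i|,
    # each solved exactly by coordinate-wise soft-thresholding.
    x = b.copy()
    for _ in range(iters):
        # Weights from linearizing the concave penalty at the current iterate.
        w = p * (np.abs(x) + eps) ** (p - 1)
        x = np.sign(b) * np.maximum(np.abs(b) - lam * w, 0.0)
    return x

print(irl1_lp_denoise(np.array([2.0, 0.05])))  # large entry kept, small one zeroed
```

Small entries receive large weights and are thresholded to zero, while large entries are barely penalized, which is the adaptive-reweighting mechanism the framework generalizes.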
Optimality conditions for the constrained Lp-regularization
The ℓp-regularization problem (0 < p < 1) is a nonsmooth and nonconvex problem and has remarkable advantages in the restoration of discrete signals and images. The constrained ℓp-regularization problem can…
Lower Bound Theory of Nonzero Entries in Solutions of ℓ2-ℓp Minimization
This paper establishes lower bounds for the absolute values of nonzero entries in every local optimal solution of the model, which can be used to identify zero entries precisely in any numerical solution, and develops a lower bound theorem to classify zero and nonzero entries in every local solution.
Sharp Time–Data Tradeoffs for Linear Inverse Problems
This paper presents a unified convergence analysis of the gradient projection algorithm applied to such problems, and its results demonstrate that a linear convergence rate is attainable even though the least squares objective is not strongly convex in these settings.
A Cone-Continuity Constraint Qualification and Algorithmic Consequences
A cone-continuity property (CCP) is defined and shown to be the weakest possible strict constraint qualification (SCQ), and its relation to other constraint qualifications is clarified.
Relating ℓp regularization and reweighted ℓ1 regularization
It is proved that after some iteration k, the iterates generated by the proposed iteratively reweighted ℓ1 methods have the same support and sign as the limit point and are bounded away from 0, so that the algorithm behaves as if solving a smooth problem in the reduced space.
Exact Reconstruction of Sparse Signals via Nonconvex Minimization
  • R. Chartrand, IEEE Signal Processing Letters, 2007
It is shown that by replacing the ℓ1 norm with the ℓp norm, exact reconstruction is possible with substantially fewer measurements, and a theorem in this direction is given.
Sparse Portfolio Selection via Quasi-Norm Regularization
In this paper, we propose $\ell_p$-norm regularized models to seek near-optimal sparse portfolios. These sparse solutions reduce the complexity of portfolio implementation and management. Theoretical…