# A new smoothing modified three-term conjugate gradient method for $l_1$-norm minimization problem

@article{Du2018ANS,
title={A new smoothing modified three-term conjugate gradient method for $l_1$-norm minimization problem},
author={Shou-qiang Du and Miao Chen},
journal={Journal of Inequalities and Applications},
year={2018},
volume={2018},
pages={1-14}
}
• Published 2018
• Mathematics, Computer Science
• Journal of Inequalities and Applications
We consider a kind of nonsmooth optimization problem with $l_1$-norm minimization, which has many applications in compressed sensing, signal reconstruction, and related engineering problems. Using smoothing approximation techniques, this kind of nonsmooth optimization problem can be transformed into a general unconstrained optimization problem, which can be solved by the proposed smoothing modified three-term conjugate gradient method. The smoothing modified three-term conjugate gradient…
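As a concrete illustration of the smoothing idea in the abstract, the sketch below replaces the nonsmooth term $\|x\|_1$ with the common smooth surrogate $\sum_i \sqrt{x_i^2 + \mu^2}$ and minimizes the resulting differentiable objective with plain gradient descent. This is an illustrative choice of smoothing function and solver, not the paper's specific three-term conjugate gradient method; the problem sizes and parameters (`lam`, `mu`, the step size) are hypothetical.

```python
import numpy as np

def smooth_l1(x, mu):
    # Smooth surrogate for ||x||_1: sum_i sqrt(x_i^2 + mu^2).
    # An illustrative smoothing function, not necessarily the paper's.
    return np.sum(np.sqrt(x**2 + mu**2))

def smooth_l1_grad(x, mu):
    # Gradient of the smoothed l1 term; well defined for mu > 0.
    return x / np.sqrt(x**2 + mu**2)

def objective(x, A, b, lam, mu):
    # Smoothed objective: 0.5 * ||Ax - b||^2 + lam * smoothed ||x||_1.
    r = A @ x - b
    return 0.5 * (r @ r) + lam * smooth_l1(x, mu)

def gradient(x, A, b, lam, mu):
    return A.T @ (A @ x - b) + lam * smooth_l1_grad(x, mu)

# Tiny demo: approximately recover a sparse vector by gradient descent
# on the smoothed problem (all sizes and constants are illustrative).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
x_true = np.zeros(10)
x_true[[2, 7]] = [1.0, -0.5]
b = A @ x_true

x = np.zeros(10)
for _ in range(2000):
    x -= 0.01 * gradient(x, A, b, lam=0.05, mu=1e-2)
```

Because the surrogate is differentiable everywhere, any smooth unconstrained solver (such as the paper's conjugate gradient variant) can be applied; the smoothing parameter `mu` trades approximation accuracy against conditioning.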
## 4 Citations

### A Modified Nonlinear Conjugate Gradient Algorithm for Large-Scale Nonsmooth Convex Optimization

• Computer Science, Mathematics
J. Optim. Theory Appl.
• 2020
A modified nonlinear conjugate gradient method is proposed that achieves global convergence and numerical efficiency, performing very well on large-scale nonsmooth problems.

### Modified Three-Term Conjugate Gradient Method and Its Applications

• Mathematics
Mathematical Problems in Engineering
• 2019
The proposed three-term conjugate gradient method is shown to possess the sufficient descent property, and its global convergence with the Armijo line search is proved.

### Unconstrained Optimization Methods: Conjugate Gradient Methods and Trust-Region Methods

Two important classes of unconstrained optimization methods are considered: conjugate gradient methods and trust-region methods; it seems that they are never out of date.

### A NEW NON-CONVEX APPROACH FOR COMPRESSIVE SENSING MRI

• Computer Science
• 2020
A novel approach, dubbed the regularized maximum entropy function (RMEF) minimization algorithm, approximates the $L_q$-norm (0 < q < 1) as a sparsity-promoting objective, and a regularization mechanism is adopted to improve denoising performance.

## References

Showing 1-10 of 34 references

### The Smoothing FR Conjugate Gradient Method for Solving a Kind of Nonsmooth Optimization Problem with $l_1$-Norm

• Computer Science, Mathematics
• 2018
This work studies a method for solving a kind of nonsmooth optimization problem with $l_1$-norm, which arises widely in compressed sensing, image processing, and related optimization problems, and demonstrates the effectiveness of the given smoothing FR conjugate gradient method.

### A New Modified Three-Term Conjugate Gradient Method with Sufficient Descent Property and Its Global Convergence

• Mathematics
• 2017
A new modified three-term conjugate gradient (CG) method is presented for solving large-scale optimization problems. The idea relates to the famous Polak-Ribière-Polyak (PRP) formula. As the…

### Smoothing methods for nonsmooth, nonconvex minimization

We consider a class of smoothing methods for minimization problems where the feasible set is convex but the objective function is not convex, not differentiable, and perhaps not even locally Lipschitz.

### An Interior-Point Method for Large-Scale $\ell_1$-Regularized Least Squares

• Computer Science
IEEE Journal of Selected Topics in Signal Processing
• 2007
A specialized interior-point method for solving large-scale $\ell_1$-regularized least squares problems (LSPs) that uses the preconditioned conjugate gradients algorithm to compute the search direction; it can solve large sparse problems, with a million variables and observations, in a few tens of minutes on a PC.

### Gradient Projection for Sparse Reconstruction: Application to Compressed Sensing and Other Inverse Problems

• Computer Science
IEEE Journal of Selected Topics in Signal Processing
• 2007
This paper proposes gradient projection algorithms for the bound-constrained quadratic programming (BCQP) formulation of these problems and tests variants of this approach that select the line search parameters in different ways, including techniques based on the Barzilai-Borwein method.
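The Barzilai-Borwein step-size rule mentioned in this abstract can be sketched in a few lines. The snippet below applies the first BB formula, $\alpha = s^\top s / s^\top y$ with $s = x_k - x_{k-1}$ and $y = g_k - g_{k-1}$, to a small convex quadratic; it is a minimal illustration, not the authors' bound-constrained gradient projection algorithm, and `bb_step` is a hypothetical helper name.

```python
import numpy as np

def bb_step(x_prev, x_curr, g_prev, g_curr):
    # First Barzilai-Borwein step size: (s's) / (s'y).
    s = x_curr - x_prev
    y = g_curr - g_prev
    return (s @ s) / (s @ y)

# Demo on a convex quadratic f(x) = 0.5 x'Qx - c'x (illustrative data).
Q = np.array([[3.0, 1.0], [1.0, 2.0]])
c = np.array([1.0, 1.0])
grad = lambda x: Q @ x - c

x_prev = np.zeros(2)
x = np.array([0.1, 0.1])          # one small initial step already taken
for _ in range(30):
    g_prev, g = grad(x_prev), grad(x)
    if np.linalg.norm(g) < 1e-12:  # stop once the gradient vanishes
        break
    alpha = bb_step(x_prev, x, g_prev, g)
    x_prev, x = x, x - alpha * g
# x converges to the minimizer Q^{-1} c
```

The appeal of the BB rule in gradient projection methods is that it injects curvature information into a first-order iteration at the cost of two inner products per step.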

### Three modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property

• Computer Science
• 2015
These modified Polak-Ribière-Polyak (PRP) conjugate gradient methods for unconstrained optimization possess the sufficient descent property without any line search and converge globally with a Wolfe line search.

### An Alternative Lagrange-Dual Based Algorithm for Sparse Signal Reconstruction

• Computer Science
IEEE Transactions on Signal Processing
• 2011
A new Lagrange-dual reformulation associated with an $l_1$-norm minimization problem for sparse signal reconstruction is presented, and the efficiency and performance of the proposed algorithm are validated via theoretical analysis as well as illustrative numerical examples.

### A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property

• Mathematics
SIAM J. Optim.
• 1999
This paper presents a new version of the conjugate gradient method, which converges globally, provided the line search satisfies the standard Wolfe conditions.
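The standard Wolfe conditions referenced here are easy to state in code: a step $\alpha$ along a descent direction $d$ must give sufficient decrease (Armijo) and satisfy a curvature bound. The sketch below is a generic check with hypothetical helper names and the conventional parameters $c_1 = 10^{-4}$, $c_2 = 0.9$, not the line search used in the cited paper.

```python
import numpy as np

def wolfe_conditions(f, grad, x, d, alpha, c1=1e-4, c2=0.9):
    # Check the standard (weak) Wolfe conditions for step alpha along d.
    g0 = grad(x) @ d                                   # directional derivative at x
    armijo = f(x + alpha * d) <= f(x) + c1 * alpha * g0
    curvature = grad(x + alpha * d) @ d >= c2 * g0
    return bool(armijo and curvature)

# Demo on f(x) = x'x with the steepest-descent direction (illustrative).
f = lambda x: float(x @ x)
grad = lambda x: 2.0 * x
x = np.array([1.0])
d = -grad(x)

ok_step = wolfe_conditions(f, grad, x, d, alpha=0.25)   # True: both conditions hold
tiny_step = wolfe_conditions(f, grad, x, d, alpha=0.01) # False: curvature condition fails
```

The curvature condition is what rules out excessively small steps: a tiny `alpha` always satisfies Armijo but leaves the directional derivative too negative.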