Corpus ID: 18572538

TR 0707 A Fixed-Point Continuation Method for ℓ1-Regularized Minimization with Applications to Compressed Sensing

@inproceedings{Hale2007TR0A,
  title={TR 0707 A Fixed-Point Continuation Method for $\ell_1$-Regularized Minimization with Applications to Compressed Sensing},
  author={Elaine T. Hale and Wotao Yin and Yin Zhang},
  year={2007}
}
We consider solving minimization problems with ℓ1-regularization: …
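Most of the works listed below solve some variant of ℓ1-regularized least squares by iterative soft-thresholding (shrinkage). As a minimal sketch, assuming the common formulation min_x ½‖Ax − b‖² + λ‖x‖₁ (plain ISTA with a fixed step size, not the paper's FPC method, which adds a continuation strategy on the regularization parameter):

```python
import numpy as np

def soft_threshold(v, nu):
    """Componentwise shrinkage: sign(v) * max(|v| - nu, 0)."""
    return np.sign(v) * np.maximum(np.abs(v) - nu, 0.0)

def ista(A, b, lam, iters=200):
    """Iterative soft-thresholding for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    Each step takes a gradient step on the smooth term, then applies the
    shrinkage (prox) step for the l1 term; step size 1/L with
    L = ||A||_2^2 guarantees convergence.
    """
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)         # gradient of 0.5*||Ax - b||^2
        x = soft_threshold(x - grad / L, lam / L)
    return x
```

For example, with A = I the fixed point is simply soft_threshold(b, λ), which the iteration reaches in one step.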

Citations

1-bit Compressive Sensing with an Improved Algorithm Based on Fixed-point Continuation
1-bit compressive sensing with improved reconstruction algorithms based on the fixed-point continuation (FPC) method is investigated; the resulting FPC-AOP-l1 algorithm achieves improved robustness against noise.
A Descent Dai-Liao Projection Method for Convex Constrained Nonlinear Monotone Equations with Applications
A descent Dai-Liao projection method for solving nonlinear monotone equations with convex constraints is proposed, and its global convergence is proved.
A Spectral Gradient Projection Method for Sparse Signal Reconstruction in Compressive Sensing
In this paper, a new spectral gradient direction is proposed to solve the ℓ1-regularized convex minimization problem. The spectral parameter of the proposed method is computed as a convex …
A Modified Self-Adaptive Conjugate Gradient Method for Solving Convex Constrained Monotone Nonlinear Equations for Signal Recovery Problems
In this article, we propose a modified self-adaptive conjugate gradient algorithm for handling nonlinear monotone equations with convex constraints. Under mild conditions, the global …
Sparse signal recovery based on nonconvex entropy minimization
Experiments on 1-dimensional sparse signal recovery and 2-dimensional real image recovery show that minimizing the nonconvex entropy favors sparse solutions and can recover sparse signals better than convex ℓ1-norm minimization and nonconvex ℓp-norm minimization.
Efficient Algorithms for Robust Recovery of Images From Compressed Data
More computationally efficient algorithms, following recent advances in large-scale convex optimization for nonsmooth regularization, are proposed to improve robust CS and to solve more sophisticated extensions that the original methods cannot handle.
An Augmented Lagrangian Approach to the Constrained Optimization Formulation of Imaging Inverse Problems
This paper proposes a new efficient algorithm, tailored to image recovery applications, for one class of constrained problems (often known as basis pursuit denoising), and shows that it is a strong contender for the state of the art.
Coordinate descent optimization for ℓ1 minimization with application to compressed sensing; a greedy algorithm
We propose a fast algorithm for solving the basis pursuit problem, $\min_u \{\|u\|_1 : Au = f\}$, which has applications to compressed sensing. We design an efficient method for solving the related …
A Fixed Point Method for Convex Systems
We present a new fixed point technique to solve a system of convex equations in several variables. Our approach is based on two powerful algorithmic ideas: operator-splitting and steepest descent.
A Fast Fixed Point Continuation Algorithm with Application to Compressed Sensing
A fast FPC (FFPC) algorithm is proposed to accelerate the convergence of the FPC algorithm, a convex-optimization method widely used for compressed-sensing reconstruction.

References

Showing 1–10 of 63 references.
Gradient Projection for Sparse Reconstruction: Application to Compressed Sensing and Other Inverse Problems
This paper proposes gradient projection algorithms for the bound-constrained quadratic programming (BCQP) formulation of these problems and tests variants of the approach that select the line-search parameters in different ways, including techniques based on the Barzilai-Borwein method.
A Posteriori Error Bounds for the Linearly-Constrained Variational Inequality Problem
  • J. Pang
  • Mathematics, Computer Science
    Math. Oper. Res.
  • 1987
A posteriori error bounds for an approximate solution to a linearly-constrained variational inequality problem are derived from three measures of error, depending on a particular characterization of an exact solution, and related to a certain class of iterative methods.
An Iterative Thresholding Algorithm for Linear Inverse Problems with a Sparsity Constraint
We consider linear inverse problems where the solution is assumed to have a sparse expansion on an arbitrary preassigned orthonormal basis. We prove that replacing the usual quadratic regularizing …
Homotopy continuation for sparse signal representation
  • D. Malioutov, M. Çetin, A. Willsky
  • Mathematics, Computer Science
    Proceedings. (ICASSP '05). IEEE International Conference on Acoustics, Speech, and Signal Processing, 2005.
  • 2005
This work describes a homotopy-continuation-based algorithm that efficiently finds and traces all solutions of basis pursuit as a function of the regularization parameter, and shows its effectiveness in accurately and efficiently generating entire solution paths.
Proximal Thresholding Algorithm for Minimization over Orthonormal Bases
This work proposes a versatile convex variational formulation for optimization over orthonormal bases that covers a wide range of problems, and establishes the strong convergence of a proximal thresholding algorithm for solving it.
Just relax: convex programming methods for identifying sparse signals in noise
  • J. Tropp
  • Mathematics, Computer Science
    IEEE Transactions on Information Theory
  • 2006
A method called convex relaxation attempts to recover the ideal sparse signal by solving a convex program, which can be done in polynomial time with standard scientific software.
On algorithms for solving least squares problems under an L1 penalty or an L1 constraint
Several algorithms can be used to compute the LASSO solution by minimizing the residual sum of squares subject to a constraint (penalty) on the sum of the absolute values of the coefficient estimates.
Random Filters for Compressive Sampling and Reconstruction
A new technique for efficiently acquiring and reconstructing signals, based on convolution with a fixed FIR filter having random taps, which is sufficiently generic to summarize many types of compressible signals and generalizes to streaming and continuous-time signals.
Splitting Algorithms for the Sum of Two Nonlinear Operators
Splitting algorithms for the sum of two monotone operators. We study two splitting algorithms for (stationary and evolution) problems involving the sum of two monotone operators. These algorithms are …