Publications
Fixed point and Bregman iterative methods for matrix rank minimization
TLDR: We propose fixed point and Bregman iterative algorithms for solving the nuclear norm minimization problem and prove convergence of the first of these algorithms.
  • Citations: 891 · Highly influential: 110
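As a rough illustration of the fixed-point iteration this abstract describes, here is a minimal NumPy sketch of singular value thresholding applied to min_X mu*||X||_* + 0.5*||A(X) - b||^2. The names A, At, tau, mu, and the stopping rule are placeholders of this sketch, not the paper's notation.

```python
import numpy as np

def svd_shrink(Y, tau):
    """Singular value shrinkage: the prox operator of tau * (nuclear norm)."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def fixed_point_nuclear(A, At, b, shape, mu, tau=1.0, iters=500, X0=None):
    """Fixed-point iteration X <- shrink(X - tau * At(A(X) - b), tau * mu)
    for min_X mu*||X||_* + 0.5*||A(X) - b||^2.
    A and At are callables for the linear map and its adjoint (placeholders)."""
    X = np.zeros(shape) if X0 is None else X0.copy()
    for _ in range(iters):
        G = At(A(X) - b)                      # gradient of the smooth data-fit term
        X = svd_shrink(X - tau * G, tau * mu)
    return X
```

For matrix completion, A would sample the observed entries and At would scatter residuals back into a zero matrix. Roughly speaking, a Bregman variant wraps this iteration in an outer loop that adds the unfit residual back to the data, b^{k+1} = b + (b^k - A(X^{k+1})), and re-solves.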
An efficient algorithm for compressed MR imaging using total variation and wavelets
TLDR: Compressed sensing, an emerging multidisciplinary field involving mathematics, probability, optimization, and signal processing, focuses on reconstructing an unknown signal from a very limited number of samples.
  • Citations: 334 · Highly influential: 34
Fast alternating linearization methods for minimizing the sum of two convex functions
TLDR: In this paper we present alternating linearization algorithms, based on an alternating direction augmented Lagrangian approach, for minimizing the sum of two convex functions, with little change in the computational effort required at each iteration.
  • Citations: 229 · Highly influential: 16
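To make the alternating linearization idea concrete, here is a schematic sketch (not the paper's exact method): each half-iteration linearizes one of the two functions at the current point and takes a proximal step on the other. The names grad_f, prox_f, grad_g, prox_g, and rho are assumptions of this sketch, as is the availability of both gradients and both prox operators.

```python
import numpy as np

def alternating_linearization(grad_f, prox_f, grad_g, prox_g, x0, rho=1.0, iters=200):
    """Schematic alternating linearization for min_x f(x) + g(x):
    alternate two proximal-gradient half-steps, each linearizing one function.
    prox_h(z, t) should return argmin_x h(x) + (1/(2t)) * ||x - z||^2."""
    x = y = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        x = prox_g(y - rho * grad_f(y), rho)  # linearize f at y, prox step on g
        y = prox_f(x - rho * grad_g(x), rho)  # linearize g at x, prox step on f
    return y
```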
Sparse Inverse Covariance Selection via Alternating Linearization Methods
TLDR: Gaussian graphical models are of great interest in statistical learning.
  • Citations: 163 · Highly influential: 15
Convergence of Fixed-Point Continuation Algorithms for Matrix Rank Minimization
TLDR: In this paper, we study the convergence/recoverability properties of the fixed-point continuation algorithm and its variants for matrix rank minimization.
  • Citations: 135 · Highly influential: 15
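The "continuation" in the algorithm's name refers to solving a sequence of problems with a geometrically decreasing regularization weight, warm-starting each solve from the previous solution. A generic sketch, where solver, mu0, and eta are placeholder names:

```python
def continuation(solver, mu0, mu_target, x0, eta=0.25):
    """Generic continuation loop: call solver(mu, x_init) for a decreasing
    sequence of mu values, warm-starting from the previous solution."""
    mu, x = mu0, x0
    while True:
        x = solver(mu, x)      # e.g., the fixed-point iteration sketched earlier
        if mu <= mu_target:
            return x
        mu = max(eta * mu, mu_target)
```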
Stochastic Quasi-Newton Methods for Nonconvex Stochastic Optimization
TLDR: We propose a general framework of stochastic quasi-Newton methods for solving nonconvex optimization, where we assume that only stochastic information about the gradients of the objective function is available via a stochastic first-order oracle (SFO).
  • Citations: 88 · Highly influential: 15
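A minimal sketch of one instance of such a framework: SGD whose search direction is rescaled by an L-BFGS-style inverse-Hessian estimate built from stochastic gradient differences. The oracle sgrad, the step lr, the memory size m, and the curvature-pair construction are simplifications of this sketch, not the paper's exact scheme.

```python
import numpy as np

def two_loop(g, mem):
    """L-BFGS two-loop recursion: apply the inverse-Hessian estimate to g.
    mem holds (s, y, rho) triples, oldest first, with rho = 1 / (s @ y)."""
    q, alphas = g.copy(), []
    for s, y, rho in reversed(mem):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    s, y, _ = mem[-1]
    q *= (s @ y) / (y @ y)                    # standard initial scaling
    for (s, y, rho), a in zip(mem, reversed(alphas)):
        q += (a - rho * (y @ q)) * s
    return q

def sqn(sgrad, x, lr=0.05, iters=1000, m=10):
    """Schematic stochastic quasi-Newton loop; sgrad(x) plays the role of the
    stochastic first-order oracle (SFO). Building curvature pairs from
    successive iterates is a simplification."""
    mem = []
    for _ in range(iters):
        g = sgrad(x)
        d = two_loop(g, mem) if mem else g    # quasi-Newton direction
        x_new = x - lr * d
        s, y = x_new - x, sgrad(x_new) - g    # extra oracle call for y
        if s @ y > 1e-10:                     # keep only safe positive curvature
            mem = (mem + [(s, y, 1.0 / (s @ y))])[-m:]
        x = x_new
    return x
```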
Barzilai-Borwein Step Size for Stochastic Gradient Descent
TLDR: We propose to use the Barzilai-Borwein method to automatically compute step sizes for SGD and its variant, the stochastic variance reduced gradient (SVRG) method, which leads to two algorithms: SGD-BB and SVRG-BB.
  • Citations: 91 · Highly influential: 14
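A rough sketch of the SGD-BB idea as the abstract describes it: run plain SGD within an epoch at a fixed step size, then set the next epoch's step from a Barzilai-Borwein formula computed on epoch-start iterates and averaged stochastic gradients. The names sgrad, m, eta0, and the exact averaging are assumptions of this sketch.

```python
import numpy as np

def sgd_bb(sgrad, x, epochs=20, m=100, eta0=0.1):
    """Schematic SGD-BB: each epoch's step size comes from a Barzilai-Borwein
    formula on differences of epoch-start iterates and per-epoch averaged
    stochastic gradients; sgrad(x) is a stochastic-gradient oracle."""
    eta, x_prev, gbar_prev = eta0, None, None
    for _ in range(epochs):
        x_start, gbar = x.copy(), np.zeros_like(x)
        for _ in range(m):
            g = sgrad(x)
            gbar += g / m                          # average gradient over the epoch
            x = x - eta * g
        if x_prev is not None:
            s, y = x_start - x_prev, gbar - gbar_prev
            if abs(s @ y) > 1e-12:
                eta = (s @ s) / (m * abs(s @ y))   # BB1 step, scaled by epoch length
        x_prev, gbar_prev = x_start, gbar
    return x
```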
Alternating Direction Methods for Latent Variable Gaussian Graphical Model Selection
TLDR: In this letter, we propose two alternating direction methods for solving the convex optimization problem for graphical model selection in the presence of unobserved variables.
  • Citations: 82 · Highly influential: 13
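Without reproducing either method, the two proximal building blocks that alternating-direction schemes for this kind of sparse-plus-low-rank log-determinant model typically need can be sketched; the function names and the splitting they would sit in are assumptions here, not the letter's algorithms.

```python
import numpy as np

def prox_l1(S, t):
    """Soft-thresholding: prox of t * ||S||_1, promoting sparsity in S."""
    return np.sign(S) * np.maximum(np.abs(S) - t, 0.0)

def prox_neg_logdet(V, t):
    """Prox of -t * logdet(.) at a symmetric V: the spectral update
    lambda -> (lambda + sqrt(lambda^2 + 4t)) / 2 solves the optimality
    condition x - t/x = lambda and keeps the result positive definite."""
    w, Q = np.linalg.eigh((V + V.T) / 2.0)
    w = (w + np.sqrt(w**2 + 4.0 * t)) / 2.0
    return (Q * w) @ Q.T
```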
On the Global Linear Convergence of the ADMM with MultiBlock Variables
TLDR: We show that, under some easily verifiable and reasonable conditions, the global linear convergence of the alternating direction method of multipliers (ADMM) can still be ensured. This matters because the ADMM is a popular method for solving large-scale multiblock optimization models and is known to perform very well in practice even when $N \ge 3$.
  • Citations: 120 · Highly influential: 10
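For concreteness, a schematic Gauss-Seidel multiblock ADMM for min sum_i f_i(x_i) subject to sum_i A_i x_i = b. The blocks list, the exact-argmin oracles, and rho are placeholders of this sketch, and the paper's conditions for linear convergence are not checked here.

```python
import numpy as np

def admm_multiblock(blocks, b, rho=1.0, iters=300):
    """Schematic N-block ADMM for  min sum_i f_i(x_i)  s.t.  sum_i A_i x_i = b.
    blocks[i] = (A_i, argmin_i), where argmin_i(v, rho) returns
    argmin_x f_i(x) + (rho/2) * ||A_i x - v||^2  (exact-solve oracle)."""
    xs = [np.zeros(A.shape[1]) for A, _ in blocks]
    lam = np.zeros_like(b)                          # dual variable
    for _ in range(iters):
        for i, (A, argmin_i) in enumerate(blocks):  # Gauss-Seidel sweep
            r = b - sum(Aj @ xs[j] for j, (Aj, _) in enumerate(blocks) if j != i)
            xs[i] = argmin_i(r - lam / rho, rho)
        lam = lam + rho * (sum(A @ x for (A, _), x in zip(blocks, xs)) - b)
    return xs
```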
Structured nonconvex and nonsmooth optimization: algorithms and iteration complexity analysis
TLDR: This paper aims to take one step toward disciplined nonconvex and nonsmooth optimization, using proximal-type variants of the ADMM to solve such structured models, assuming the proximal ADMM updates can be implemented for all the block variables.
  • Citations: 57 · Highly influential: 10
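As a hedged illustration of what one common "proximal-type" ADMM block update can look like (not necessarily the variant analyzed in the paper): linearize the quadratic coupling term at the current block and take a single prox step, which keeps the update implementable even when f_i is nonsmooth.

```python
import numpy as np

def prox_linearized_block_step(prox_fi, Ai, xi, v, lam, rho, tau):
    """One linearized proximal block update inside an ADMM sweep (schematic).
    Instead of minimizing f_i(x) + (rho/2) * ||Ai x - v + lam/rho||^2 exactly,
    linearize the quadratic at the current xi and take a prox step of size 1/tau.
    prox_fi(z, t) should return argmin_x f_i(x) + (1/(2t)) * ||x - z||^2."""
    g = rho * Ai.T @ (Ai @ xi - v + lam / rho)   # gradient of the coupling term
    return prox_fi(xi - g / tau, 1.0 / tau)
```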