We generalize the primal-dual hybrid gradient (PDHG) algorithm proposed by Zhu and Chan in [M. Zhu and T. F. Chan, An Efficient Primal-Dual Hybrid Gradient Algorithm for Total Variation Image Restoration, UCLA CAM Report 08-34, May 2008] to a broader class of convex optimization problems. In addition, we survey several closely related methods and explain …
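To make the kind of iteration being generalized concrete, here is a minimal Python sketch of a PDHG-style primal-dual update applied to 1-D total-variation denoising, min_x 0.5‖x − b‖² + λ‖Dx‖₁. The function name pdhg_tv_denoise_1d, the step sizes, and the extrapolation step are illustrative assumptions, not the exact scheme of Zhu and Chan or its generalization.

```python
# Illustrative PDHG-style iteration for min_x 0.5*||x - b||^2 + lam*||D x||_1,
# where D is a 1-D forward-difference operator. Step sizes satisfy
# tau * sigma * ||D||^2 < 1 for this toy problem.
import numpy as np

def pdhg_tv_denoise_1d(b, lam=0.2, tau=0.25, sigma=0.5, iters=500):
    n = b.size
    D = np.diff(np.eye(n), axis=0)   # (n-1) x n forward differences
    x = b.copy()
    y = np.zeros(n - 1)              # dual variable paired with D x
    x_bar = x.copy()
    for _ in range(iters):
        # dual ascent step, then projection onto the l-infinity ball of radius lam
        y = np.clip(y + sigma * D @ x_bar, -lam, lam)
        # primal step; the prox of 0.5*||x - b||^2 has a closed form
        x_new = (x - tau * D.T @ y + tau * b) / (1.0 + tau)
        # extrapolation as used in Chambolle-Pock-style variants of PDHG
        x_bar = 2.0 * x_new - x
        x = x_new
    return x

if __name__ == "__main__":
    t = np.linspace(0, 1, 200)
    signal = (t > 0.5).astype(float)
    noisy = signal + 0.1 * np.random.randn(t.size)
    print(np.abs(pdhg_tv_denoise_1d(noisy) - signal).mean())
```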
A collaborative convex framework for factoring a data matrix X into a nonnegative product AS, with a sparse coefficient matrix S, is proposed. We restrict the columns of the dictionary matrix A to coincide with certain columns of the data matrix X, thereby guaranteeing a physically meaningful dictionary and dimensionality reduction …
We generalize the primal-dual hybrid gradient (PDHG) algorithm proposed by Zhu and Chan in [M. Zhu and T. F. Chan, An Efficient Primal-Dual Hybrid Gradient Algorithm for Total Variation Image Restoration, UCLA CAM Report 08-34, May 2008], draw connections to similar methods, and discuss convergence of several special cases and modifications. In …
The primal-dual hybrid gradient (PDHG) method is a powerful optimization scheme that breaks complex problems into simple sub-steps. Unfortunately, PDHG methods require the user to choose stepsize parameters, and the speed of convergence is highly sensitive to this choice. We introduce new adaptive PDHG schemes that automatically tune the stepsize parameters …
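As a rough illustration of what automatic stepsize tuning can look like, the sketch below rebalances the primal and dual step sizes whenever one residual dominates the other, while keeping the product τσ fixed so the usual stability bound is preserved. The function adapt_steps, the constants, and the update rule are assumptions for illustration, not the adaptive schemes proposed in the paper.

```python
# Simplified residual-balancing heuristic for PDHG step sizes. `primal_res` and
# `dual_res` are assumed to be norms of the current primal and dual residuals
# computed by the surrounding solver; tau * sigma is left unchanged by each update.
def adapt_steps(tau, sigma, primal_res, dual_res, alpha=0.5, ratio=2.0):
    if primal_res > ratio * dual_res:
        # primal residual dominates: take larger primal steps, smaller dual steps
        return tau / (1.0 - alpha), sigma * (1.0 - alpha)
    if dual_res > ratio * primal_res:
        # dual residual dominates: do the opposite
        return tau * (1.0 - alpha), sigma / (1.0 - alpha)
    return tau, sigma
```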
A collaborative convex framework for factoring a data matrix X into a non-negative product AS, with a sparse coefficient matrix S, is introduced. We restrict the columns of the dictionary matrix A to coincide with certain columns of X, thereby guaranteeing a physically meaningful dictionary and dimensionality reduction. As an example, we show applications …
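The modeling idea, building the dictionary directly out of data columns and fitting a sparse nonnegative coefficient matrix, can be illustrated with a toy projected-gradient fit. The column selection, penalty weight, and function fit_sparse_nonneg_coeffs below are hypothetical and do not reproduce the paper's convex formulation.

```python
# Toy illustration only: the dictionary A is built from selected columns of the
# data matrix X itself, and a sparse nonnegative S is fit by projected gradient on
# 0.5*||X - A S||_F^2 + mu*sum(S), subject to S >= 0.
import numpy as np

def fit_sparse_nonneg_coeffs(X, dict_cols, mu=0.1, iters=500):
    A = X[:, dict_cols]                                  # dictionary = chosen data columns
    S = np.zeros((A.shape[1], X.shape[1]))
    step = 1.0 / (np.linalg.norm(A, 2) ** 2 + 1e-12)     # 1/L for the smooth term
    for _ in range(iters):
        grad = A.T @ (A @ S - X)                         # gradient of the smooth part
        S = np.maximum(S - step * (grad + mu), 0.0)      # prox step for mu*sum(S) + nonnegativity
    return A, S

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.abs(rng.standard_normal((20, 50)))
    A, S = fit_sparse_nonneg_coeffs(X, dict_cols=[0, 5, 10, 15])
    print(np.linalg.norm(X - A @ S) / np.linalg.norm(X))
```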
This research examines methods of applying deconvolution to blurry, noisy barcode signals. Our goal is to take these signals and reconstruct them, using Yu Mao's method of gradient projection, to be as clear as possible. This research examines the work of Yu Mao [5], an alumnus of the University of Minnesota. Our research is motivated by Yu Mao's …
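For readers unfamiliar with gradient projection in this setting, here is a small self-contained sketch that deblurs a synthetic 1-D barcode signal by minimizing a least-squares data term subject to box constraints. The blur kernel, step size, and helper gradient_projection_deblur are illustrative assumptions, not Yu Mao's method as described in [5].

```python
# Gradient-projection sketch for 1-D deblurring: minimize 0.5*||k * x - b||^2
# subject to 0 <= x <= 1, where * denotes convolution with a known kernel k.
import numpy as np

def gradient_projection_deblur(b, kernel, iters=300):
    K = lambda x: np.convolve(x, kernel, mode="same")          # blur operator
    Kt = lambda r: np.convolve(r, kernel[::-1], mode="same")   # adjoint (correlation)
    step = 1.0 / (np.sum(np.abs(kernel)) ** 2)                 # crude 1/L estimate
    x = np.clip(b, 0.0, 1.0)
    for _ in range(iters):
        grad = Kt(K(x) - b)                      # gradient of the data-fidelity term
        x = np.clip(x - step * grad, 0.0, 1.0)   # project back onto the box [0, 1]
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    bars = np.repeat(rng.integers(0, 2, 20), 10).astype(float)   # synthetic barcode
    kernel = np.exp(-0.5 * (np.arange(-5, 6) / 2.0) ** 2)
    kernel /= kernel.sum()
    blurred = np.convolve(bars, kernel, mode="same") + 0.02 * rng.standard_normal(bars.size)
    print(np.abs(gradient_projection_deblur(blurred, kernel) - bars).mean())
```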
The primal-dual hybrid gradient (PDHG) method is a powerful splitting scheme for large-scale constrained and non-differentiable problems. We present practical adaptive variants of PDHG that converge more quickly and are easier to use than conventional splitting schemes. We also study the convergence of PDHG and prove new results guaranteeing convergence of the …