The time-frequency and timescale communities have recently developed a large number of overcomplete waveform dictionaries — stationary wavelets, wavelet packets, cosine packets, chirplets, and warplets, to name a few. Decomposition into overcomplete systems is not unique, and several methods for decomposition have been proposed, including the method of …
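One widely used route to a sparse decomposition in an overcomplete dictionary is to minimize the ℓ1 norm of the coefficients subject to exact reconstruction (a basis pursuit style formulation). Below is a minimal sketch, not the decomposition method of this particular paper, using SciPy's linear programming solver on a small random dictionary (all data here are toy assumptions):

import numpy as np
from scipy.optimize import linprog

# Overcomplete dictionary: 20 atoms for a 10-dimensional signal (toy data).
rng = np.random.default_rng(0)
Phi = rng.standard_normal((10, 20))
alpha_true = np.concatenate([rng.standard_normal(3), np.zeros(17)])  # sparse ground truth
s = Phi @ alpha_true

# Basis pursuit: min ||alpha||_1  subject to  Phi @ alpha = s.
# Split alpha = u - v with u, v >= 0, which turns the problem into a linear program.
n = Phi.shape[1]
c = np.ones(2 * n)                    # objective: sum(u) + sum(v) = ||alpha||_1
A_eq = np.hstack([Phi, -Phi])         # Phi @ (u - v) = s
res = linprog(c, A_eq=A_eq, b_eq=s, bounds=[(0, None)] * (2 * n))
alpha = res.x[:n] - res.x[n:]
print("nonzero coefficients:", np.sum(np.abs(alpha) > 1e-8))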
The lasso penalizes a least squares regression by the sum of the absolute values (L1-norm) of the coefficients. The form of this penalty encourages sparse solutions (with many coefficients equal to 0). We propose the 'fused lasso', a generalization that is designed for problems with features that can be ordered in some meaningful way. The fused lasso …
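For concreteness, with response values y_i and ordered features x_{ij}, the fused lasso penalizes both the coefficients and their successive differences; in Lagrangian form the criterion can be sketched as

\[
\hat\beta \;=\; \arg\min_{\beta}\; \sum_{i=1}^{n}\Bigl(y_i - \sum_{j=1}^{p} x_{ij}\beta_j\Bigr)^{2}
\;+\; \lambda_1 \sum_{j=1}^{p} |\beta_j|
\;+\; \lambda_2 \sum_{j=2}^{p} |\beta_j - \beta_{j-1}|,
\]

where the first penalty encourages sparsity in the coefficients and the second encourages the coefficient profile to be piecewise constant across the ordering.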
An iterative method is given for solving Ax = b and min ‖Ax − b‖₂, where the matrix A is large and sparse. The method is based on the bidiagonalization procedure of Golub and Kahan. It is analytically equivalent to the standard method of conjugate gradients, but possesses more favorable numerical properties. Reliable stopping criteria are derived, along …
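This bidiagonalization-based method is available as LSQR in standard numerical libraries; a minimal sketch using SciPy's scipy.sparse.linalg.lsqr on a small random sparse system (toy data, not from the paper):

import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)
A = sparse_random(200, 50, density=0.05, random_state=0, format="csr")  # sparse A
b = rng.standard_normal(200)

# lsqr solves min ||Ax - b||_2 iteratively, touching A only through
# matrix-vector products with A and A^T (Golub-Kahan bidiagonalization).
result = lsqr(A, b, atol=1e-8, btol=1e-8)
x, istop, itn, r1norm = result[0], result[1], result[2], result[3]
print(f"stop flag {istop} after {itn} iterations, residual norm {r1norm:.3e}")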
Sequential quadratic programming (SQP) methods have proved highly effective for solving constrained optimization problems with smooth nonlinear functions in the objective and constraints. Here we consider problems with general inequality constraints (linear and nonlinear). We assume that first derivatives are available, and that the constraint gradients are …
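The authors' own SQP codes are Fortran libraries, but the problem class described here (smooth objective with general linear and nonlinear inequality constraints) can be illustrated with SciPy's SLSQP routine, which is also a sequential quadratic programming method. A sketch on a toy problem, not the authors' implementation:

import numpy as np
from scipy.optimize import minimize

# Toy problem: smooth objective with one linear and one nonlinear inequality
# constraint, both written in SciPy's g(x) >= 0 convention.
def objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

constraints = [
    {"type": "ineq", "fun": lambda x: x[0] + 2 * x[1] - 2},        # linear:    x0 + 2*x1 >= 2
    {"type": "ineq", "fun": lambda x: 4 - x[0] ** 2 - x[1] ** 2},  # nonlinear: inside a disk
]

# SLSQP builds and solves a QP subproblem at each iterate from the objective
# gradient and constraint Jacobian, the hallmark of SQP methods.
res = minimize(objective, x0=np.array([2.0, 0.0]), method="SLSQP", constraints=constraints)
print(res.x, res.fun)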
An algorithm for solving large-scale nonlinear programs with linear constraints is presented. The method combines efficient sparse-matrix techniques as in the revised simplex method with stable quasi-Newton methods for handling the nonlinearities. A general-purpose production code (MINOS) is described, along with computational experience on a wide variety …
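MINOS itself is a Fortran production code; purely to illustrate the problem class it targets (a smooth nonlinear objective with linear constraints), here is a sketch using SciPy's trust-constr method with an explicit LinearConstraint on a toy problem. This is not MINOS and not its algorithm:

import numpy as np
from scipy.optimize import minimize, LinearConstraint

# Nonlinear objective (Rosenbrock) with the linear constraint x0 + x1 <= 1
# and the simple bound x0 >= 0.
def rosen(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

lin_con = LinearConstraint([[1.0, 1.0]], lb=-np.inf, ub=1.0)

res = minimize(rosen, x0=np.array([0.5, 0.0]), method="trust-constr",
               constraints=[lin_con], bounds=[(0.0, None), (None, None)])
print(res.x)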
We generalize Newton-type methods for minimizing smooth functions to handle a sum of two convex functions: a smooth function and a nonsmooth function with a simple proximal mapping. We show that the resulting proximal Newton-type methods inherit the desirable convergence behavior of Newton-type methods for minimizing smooth functions, even when search …
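As a concrete illustration of the composite objective f(x) + g(x), with f smooth and g having a simple proximal mapping, the sketch below shows the first-order special case in which the scaled Hessian is replaced by (1/t)·I, reducing the proximal Newton-type step to proximal gradient with soft thresholding on a lasso problem. Data and step size are toy assumptions:

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 30))
b = rng.standard_normal(100)
lam = 0.5

def soft_threshold(z, tau):
    # Proximal mapping of g(x) = tau * ||x||_1.
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

# Proximal gradient on f(x) = 0.5*||Ax - b||^2 plus g(x) = lam*||x||_1;
# this is the proximal Newton-type iteration with H_k = (1/t) * I.
t = 1.0 / np.linalg.norm(A, 2) ** 2   # step size 1/L, L = Lipschitz constant of grad f
x = np.zeros(A.shape[1])
for _ in range(500):
    grad = A.T @ (A @ x - b)
    x = soft_threshold(x - t * grad, lam * t)

print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum())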
SNOPT is a set of Fortran subroutines for minimizing a smooth function subject to constraints, which may include simple bounds on the variables, linear constraints and smooth nonlinear constraints. SNOPT is a general-purpose optimizer, designed to find locally optimal solutions for models involving smooth nonlinear functions. They are often more widely …
NPSOL is a set of Fortran subroutines for minimizing a smooth function subject to constraints, which may include simple bounds on the variables, linear constraints and smooth nonlinear constraints. (NPSOL may also be used for unconstrained, bound-constrained and linearly constrained optimization.) The user provides subroutines to define the objective and …
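Both NPSOL and SNOPT address the same general problem class; written out, it can be sketched as

\[
\min_{x \in \mathbb{R}^n} \; f(x)
\quad \text{subject to} \quad
\ell \;\le\; \begin{pmatrix} x \\ A_L x \\ c(x) \end{pmatrix} \;\le\; u,
\]

where A_L is the matrix of linear constraints, c(x) collects the smooth nonlinear constraint functions, components of the bounds \ell and u may be infinite, and setting \ell_i = u_i turns the corresponding row into an equality constraint.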