
- Scott Saobing Chen, David L. Donoho, Michael A. Saunders
- SIAM Review
- 1998

The time-frequency and timescale communities have recently developed a large number of overcomplete waveform dictionaries — stationary wavelets, wavelet packets, cosine packets, chirplets, and warplets, to name a few. Decomposition into overcomplete systems is not unique, and several methods for decomposition have been proposed, including the method of…
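The basis pursuit decomposition this abstract refers to (minimize ‖x‖₁ subject to Ax = b) can be posed as a linear program by splitting x into nonnegative parts. A minimal sketch with `scipy.optimize.linprog`; the dictionary, sparsity pattern, and sizes below are invented for illustration and come from nothing in the paper:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
A = rng.standard_normal((10, 30))     # made-up overcomplete dictionary (10 x 30)
x_true = np.zeros(30)
x_true[[3, 17]] = [2.0, -1.5]         # a sparse coefficient vector
b = A @ x_true

# Split x = u - v with u, v >= 0, so that ||x||_1 = sum(u + v),
# turning min ||x||_1 s.t. Ax = b into a linear program.
n = A.shape[1]
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
x = res.x[:n] - res.x[n:]             # recover the signed coefficients
```

The recovered `x` is feasible (Ax = b) and has ℓ₁ norm no larger than that of any other feasible vector, including `x_true`.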

- Christopher C. Paige, Michael A. Saunders
- ACM Trans. Math. Softw.
- 1982

An iterative method is given for solving Ax ≈ b and min ‖Ax − b‖₂, where the matrix A is large and sparse. The method is based on the bidiagonalization procedure of Golub and Kahan. It is analytically equivalent to the standard method of conjugate gradients, but possesses more favorable numerical properties. Reliable stopping criteria are derived, along…

- Philip E. Gill, Walter Murray, Michael A. Saunders
- SIAM Review
- 2002

Sequential quadratic programming (SQP) methods have proved highly effective for solving constrained optimization problems with smooth nonlinear functions in the objective and constraints. Here we consider problems with general inequality constraints (linear and nonlinear). We assume that first derivatives are available and that the constraint gradients are…
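This paper concerns the SNOPT solver; as a rough illustration of the SQP idea it describes, here is a different, off-the-shelf SQP-type solver (`scipy.optimize.minimize` with `method='SLSQP'`, not the authors' code) applied to a toy problem with one linear inequality constraint and a bound:

```python
import numpy as np
from scipy.optimize import minimize

# Toy problem (invented for illustration):
#   minimize (x0 - 1)^2 + (x1 - 2)^2
#   subject to x0 + x1 <= 2 and x0 >= 0.
obj = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2
cons = [{'type': 'ineq', 'fun': lambda x: 2.0 - x[0] - x[1]}]  # g(x) >= 0 form

res = minimize(obj, x0=[0.0, 0.0], method='SLSQP',
               constraints=cons, bounds=[(0, None), (None, None)])
# The optimum is the projection of (1, 2) onto the constraint set: (0.5, 1.5).
```

Each SLSQP iteration solves a quadratic programming subproblem built from the objective's gradient and the constraint gradients, which is exactly the structure the abstract's assumptions (first derivatives available) support.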

- Bruce A. Murtagh, Michael A. Saunders
- Math. Program.
- 1978

- Jason D. Lee, Yuekai Sun, Michael A. Saunders
- SIAM Journal on Optimization
- 2014

We generalize Newton-type methods for minimizing smooth functions to handle a sum of two convex functions: a smooth function and a nonsmooth function with a simple proximal mapping. We show that the resulting proximal Newton-type methods inherit the desirable convergence behavior of Newton-type methods for minimizing smooth functions, even when search…
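The key ingredient named in this abstract is the proximal mapping of the nonsmooth term. A minimal sketch, assuming that term is λ‖x‖₁, whose prox is elementwise soft-thresholding; for brevity this uses a plain proximal *gradient* iteration (ISTA) rather than the proximal Newton methods the paper develops:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1: elementwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_gradient_lasso(A, b, lam, steps=500):
    """Minimize 0.5 * ||Ax - b||^2 + lam * ||x||_1 by proximal gradient."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        g = A.T @ (A @ x - b)              # gradient of the smooth part
        x = soft_threshold(x - g / L, lam / L)   # prox step on the nonsmooth part
    return x
```

A proximal Newton method replaces the scaled gradient step with a step computed from a Hessian (or quasi-Newton) model of the smooth part, but the prox mapping plays the same role.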

- Christopher C. Paige, Michael A. Saunders
- ACM Trans. Math. Softw.
- 1982

- David Chin-Lung Fong, Michael A. Saunders
- SIAM J. Scientific Computing
- 2011

An iterative method LSMR is presented for solving linear systems Ax = b and least-squares problems min ‖Ax − b‖₂, with A being sparse or a fast linear operator. LSMR is based on the Golub-Kahan bidiagonalization process. It is analytically equivalent to the MINRES method applied to the normal equation AᵀAx = Aᵀb, so that the quantities ‖Aᵀrₖ‖ are…
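SciPy also provides this method as `scipy.sparse.linalg.lsmr`. A small sketch on synthetic data (the dense matrix here is invented; the solver equally accepts sparse matrices and linear operators), checked against a direct dense least-squares solve:

```python
import numpy as np
from scipy.sparse.linalg import lsmr

rng = np.random.default_rng(2)
A = rng.standard_normal((50, 10))      # made-up overdetermined system
b = rng.standard_normal(50)

x = lsmr(A, b)[0]                      # first tuple element is the solution

# Reference answer from a dense direct solver, for comparison.
x_ref = np.linalg.lstsq(A, b, rcond=None)[0]
```

At convergence the normal-equation residual ‖Aᵀ(b − Ax)‖ is small, which is precisely the monotone quantity LSMR monitors.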

- Michael P. Friedlander, Michael A. Saunders
- SIAM Journal on Optimization
- 2005


- Philip E. Gill, Walter Murray, Dulce B. Ponceleón, Michael A. Saunders
- 1992

Methods are discussed for the solution of sparse linear equations Ky = z, where K is symmetric and indefinite. Since exact solutions are not always required, direct and iterative methods are both of interest. An important direct method is the Bunch-Parlett factorization K = UᵀDU, where U is triangular and D is block-diagonal. A sparse implementation exists in…
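For a sense of what such a symmetric-indefinite factorization looks like in practice: `scipy.linalg.ldl` computes a pivoted LDLᵀ factorization via LAPACK's Bunch-Kaufman routine, a close relative of the Bunch-Parlett method named in the abstract (the pivoting strategy differs, but the K = UᵀDU structure with block-diagonal D is the same):

```python
import numpy as np
from scipy.linalg import ldl

rng = np.random.default_rng(4)
B = rng.standard_normal((5, 5))
K = B + B.T                     # symmetric and, in general, indefinite

# Pivoted LDL^T: L is (permuted) unit lower triangular, D is block-diagonal
# with 1x1 and 2x2 blocks, which is what allows indefinite K.
L, D, perm = ldl(K)
```

Once the factorization is available, solving Ky = z reduces to two triangular solves and a block-diagonal solve.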

In barrier methods for constrained optimization, the main work lies in solving large linear systems Kp = r, where K is symmetric and indefinite. For linear programs, these KKT systems are usually reduced to smaller positive-definite systems AH⁻¹Aᵀq = s, where H is a large principal submatrix of K. These systems can be solved more efficiently, but AH⁻¹…
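The reduction from the full KKT system to the smaller positive-definite system AH⁻¹Aᵀq = s can be written out directly. A dense toy sketch (the sizes, H, and right-hand sides are invented for illustration, and H is taken diagonal so forming H⁻¹ is trivial):

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 4, 8
A = rng.standard_normal((m, n))
H = np.diag(rng.uniform(1.0, 2.0, n))   # positive-definite diagonal H
r1 = rng.standard_normal(n)
r2 = rng.standard_normal(m)

# Full symmetric indefinite KKT system:
#   [ H  A^T ] [p]   [r1]
#   [ A   0  ] [q] = [r2]
K = np.block([[H, A.T], [A, np.zeros((m, m))]])
full = np.linalg.solve(K, np.concatenate([r1, r2]))

# Eliminate p = H^{-1}(r1 - A^T q) to get the reduced positive-definite system:
#   (A H^{-1} A^T) q = A H^{-1} r1 - r2
Hinv = np.linalg.inv(H)
S = A @ Hinv @ A.T
q = np.linalg.solve(S, A @ Hinv @ r1 - r2)
p = Hinv @ (r1 - A.T @ q)
```

The reduced matrix S is m-by-m instead of (m+n)-by-(m+n) and is positive definite whenever A has full row rank, which is what makes the reduction attractive, at the cost of the fill-in and conditioning issues the abstract alludes to.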