Large-scale linearly constrained optimization

@article{Murtagh1978LargescaleLC,
  title={Large-scale linearly constrained optimization},
  author={B. Murtagh and M. Saunders},
  journal={Mathematical Programming},
  year={1978},
  volume={14},
  pages={41--72}
}
An algorithm for solving large-scale nonlinear programs with linear constraints is presented. The method combines efficient sparse-matrix techniques as in the revised simplex method with stable quasi-Newton methods for handling the nonlinearities. A general-purpose production code (MINOS) is described, along with computational experience on a wide variety of problems. 
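As a rough illustration of the reduced-gradient idea behind this approach, here is a minimal sketch (an assumption for illustration, not the MINOS implementation — MINOS maintains a sparse LU factorization of a basis matrix and quasi-Newton updates, whereas this toy uses a dense SVD and plain gradient steps):

```python
import numpy as np

# Toy reduced-gradient method: minimize f(x) subject to Ax = b by stepping
# only along directions in the null space of A, so iterates stay feasible.

def nullspace_gradient_step(f_grad, A, x, alpha=0.1):
    """One step: project the gradient onto null(A) and descend along it."""
    # Orthonormal basis Z for null(A) via SVD (dense and naive; a production
    # code would factorize a sparse basis of A instead).
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > 1e-12))
    Z = Vt[rank:].T                      # columns of Z span null(A)
    reduced_grad = Z.T @ f_grad(x)       # gradient in the reduced space
    return x - alpha * (Z @ reduced_grad)

# Example: minimize ||x||^2 subject to x1 + x2 + x3 = 3.
A = np.array([[1.0, 1.0, 1.0]])
x = np.array([3.0, 0.0, 0.0])            # feasible starting point
for _ in range(200):
    x = nullspace_gradient_step(lambda y: 2.0 * y, A, x)
# x approaches (1, 1, 1), the constrained minimizer, while Ax stays 3.
```

Because every step lies in null(A), feasibility is preserved exactly; only the objective is reduced.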
A Solver for large-scale indefinite quadratic programs
Based on an active set strategy, a method is presented for solving linearly constrained indefinite quadratic programs by solving a corresponding system of equations at each iteration. …
Computing a search direction for large scale linearly constrained nonlinear optimization calculations
We consider the computation of Newton-like search directions that are appropriate when solving large-scale linearly constrained nonlinear optimization problems. We investigate the use of both direct …
An algorithm for large-scale linearly constrained nondifferentiable convex minimization
A partial proximal bundle method is given for solving a large convex program obtained by augmenting the objective of a linear program with a non-smooth convex function depending on relatively few …
A primal truncated newton algorithm with application to large-scale nonlinear network optimization
We describe a new, convergent, primal-feasible algorithm for linearly constrained optimization. It is capable of rapid asymptotic behavior and has relatively low storage requirements. Its application …
A Numerically stable reduced-gradient type algorithm for solving large-scale linearly constrained minimization problems
A reduced-gradient type algorithm for solving large-scale linearly constrained minimization problems using a preconditioned conjugate-gradient scheme; the method provides numerical stability, and total storage may be predicted before the calculations begin.
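A minimal sketch of a preconditioned conjugate-gradient solve of the kind such reduced-gradient schemes employ (the Jacobi preconditioner here is an illustrative assumption, not necessarily the paper's choice); note the fixed, small set of work vectors, which is why storage can be bounded in advance:

```python
import numpy as np

# Preconditioned conjugate gradients for H x = rhs, H symmetric positive
# definite, with a Jacobi (diagonal) preconditioner M = diag(H).
# Storage: a handful of vectors, independent of the iteration count.

def pcg(H, rhs, tol=1e-10, max_iter=100):
    x = np.zeros_like(rhs)
    r = rhs - H @ x                      # initial residual
    M_inv = 1.0 / np.diag(H)             # apply M^{-1} elementwise
    z = M_inv * r
    p = z.copy()
    for _ in range(max_iter):
        Hp = H @ p
        alpha = (r @ z) / (p @ Hp)
        x += alpha * p
        r_new = r - alpha * Hp
        if np.linalg.norm(r_new) < tol:
            break
        z_new = M_inv * r_new
        beta = (r_new @ z_new) / (r @ z)
        p = z_new + beta * p
        r, z = r_new, z_new
    return x

H = np.array([[4.0, 1.0],
              [1.0, 3.0]])               # symmetric positive definite
p = pcg(H, np.array([1.0, 2.0]))
```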
A projected Lagrangian algorithm and its implementation for sparse nonlinear constraints
An algorithm is described for solving large-scale nonlinear programs whose objective and constraint functions are smooth and continuously differentiable. The algorithm is of the projected Lagrangian …
A direct search approach to nonlinear integer programming
An approach to the solution of large-scale nonlinear programming problems with integer restrictions on some of the variables is described. The method is based on the MINOS large-scale optimization …
An algorithm for linearly constrained programs with a partly linear objective function
A Newton-type algorithm for finding a minimum of a function, subject to linear constraints, which is linear in some of its arguments is presented. It employs an active set strategy especially adapted …
Nonlinear programming on generalized networks
A specialization of the primal truncated Newton algorithm for solving nonlinear optimization problems on networks with gains, able to capitalize on the special structure of the constraints.
On Modified Factorizations for Large-Scale Linearly Constrained Optimization
  • N. Gould
  • Mathematics, Computer Science
  • SIAM J. Optim.
  • 1999
The main issue addressed is how to ensure that a quadratic model of the objective function is positive definite in the null space of the constraints while neither adversely affecting the convergence of Newton's method nor incurring a significant computational overhead.

References

Showing 1–10 of 37 references
Newton-type methods for unconstrained and linearly constrained optimization
The methods are intimately based on the recurrence of matrix factorizations and are linked to earlier work on quasi-Newton methods and quadratic programming.
Projection methods for non-linear programming
Several algorithms are presented for solving the non-linear programming problem, based on “variable-metric” projections of the gradient of the objective function into a local approximation to the …
The Simplex Method for Quadratic Programming
A computational procedure is given for finding the minimum of a quadratic function of variables subject to linear inequality constraints. The procedure is analogous to the Simplex Method …
Aspects of large-scale in-core linear programming
Minor changes in the computational aspects of the simplex algorithm, coupled with efficient inverse matrix representation, show that the major portion of the inverse in product form of a basis may be embedded in the constraint matrix.
A Second Order Method for the Linearly Constrained Nonlinear Programming Problem
An algorithm using second derivatives for solving the problem: minimize f(x) subject to Ax − b ≥ 0 is presented. Convergence to a second-order Kuhn–Tucker point is proved. If the strict …
Robust implementation of Lemke's method for the linear complementarity problem
This note discusses techniques for implementing Lemke's algorithm for the linear complementarity problem in a numerically robust way, as well as a method for recovering from loss of feasibility or …
The Variable Reduction Method for Nonlinear Programming
A first-order method for solving the problem: minimize f(x) subject to Ax − b ≥ 0 is presented. The method contains ideas based on variable reduction with anti-zig-zagging and acceleration …
The simplex method of linear programming using LU decomposition
The theoretical background for an implementation based upon the LU decomposition, computed with row interchanges, of the basic matrix of the simplex method; the implementation is slow but has good round-off error behavior.
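A small sketch of the underlying idea — computing one LU factorization (with row interchanges) of the basis matrix and reusing it for the two linear systems each simplex iteration needs, instead of maintaining an explicit inverse — using SciPy's `lu_factor`/`lu_solve` as a stand-in for the paper's implementation:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# The simplex method repeatedly solves systems with the current basis B:
#   B x_B = b      (basic solution)
#   B^T y = c_B    (simplex multipliers / pricing)
# One pivoted LU factorization of B serves both solves.

B = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])          # example nonsingular basis
b = np.array([3.0, 5.0, 3.0])
c_B = np.array([1.0, 0.0, 2.0])

lu, piv = lu_factor(B)                   # PB = LU, with row interchanges
x_B = lu_solve((lu, piv), b)             # solves B x_B = b
y = lu_solve((lu, piv), c_B, trans=1)    # trans=1 solves B^T y = c_B
```

The row interchanges (partial pivoting) are what give the decomposition its favorable round-off behavior compared with product-form-of-the-inverse updates.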
Linear Programming via a Nondifferentiable Penalty Function
A numerically stable form of an algorithm that is closely related to the work of Gill and Murray [5] and Conn [3] is presented. Among other reasons, the penalty function approach has never been …
A Rapidly Convergent Descent Method for Minimization
A number of theorems are proved to show that the method always converges, and converges rapidly; it has been used to solve a system of one hundred non-linear simultaneous equations.