Data Structures and Programming Techniques for the Implementation of Karmarkar's Algorithm

@article{Adler-Karmarkar-Resende-Veiga,
  title={Data Structures and Programming Techniques for the Implementation of Karmarkar's Algorithm},
  author={Ilan Adler and Narendra Karmarkar and Mauricio G. C. Resende and Geraldo Veiga},
  journal={INFORMS J. Comput.}
}
This paper describes data structures and programming techniques used in an implementation of Karmarkar's algorithm for linear programming. Most of our discussion focuses on applying Gaussian elimination toward the solution of a sequence of sparse symmetric positive definite systems of linear equations, the main requirement in Karmarkar's algorithm. Our approach relies on a direct factorization scheme, with an extensive symbolic factorization step performed in a preparatory stage of the linear… 
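The computational kernel described in the abstract is the repeated solution of sparse symmetric positive definite systems by direct factorization. As a rough illustration only (not the authors' code, which uses sparse data structures and a separate symbolic factorization stage), the following is a minimal dense Cholesky factorization and solve in plain Python; the matrix `M` and right-hand side `b` are made-up examples:

```python
def cholesky(M):
    """Factor a symmetric positive definite matrix M as L * L^T (dense)."""
    n = len(M)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = (M[i][i] - s) ** 0.5   # diagonal entry
            else:
                L[i][j] = (M[i][j] - s) / L[j][j]
    return L

def solve_spd(M, b):
    """Solve M x = b for SPD M via Cholesky, then two triangular solves."""
    L = cholesky(M)
    n = len(b)
    # Forward substitution: L y = b
    y = [0.0] * n
    for i in range(n):
        y[i] = (b[i] - sum(L[i][k] * y[k] for k in range(i))) / L[i][i]
    # Backward substitution: L^T x = y
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(L[k][i] * x[k] for k in range(i + 1, n))) / L[i][i]
    return x

M = [[4.0, 2.0], [2.0, 3.0]]  # hypothetical small SPD system
b = [10.0, 8.0]
x = solve_spd(M, b)
```

In an interior-point implementation this solve is performed once per iteration on a system of the form A D² Aᵀ dy = b, whose nonzero pattern is fixed, which is why a one-time symbolic factorization pays off.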

An implementation of Karmarkar's algorithm for linear programming
Based on a continuous version of Karmarkar's algorithm, two variants resulting from first- and second-order approximations of the continuous trajectory are implemented and tested; both compare favorably with the simplex code MINOS 4.0.
Implementation and computational results for the hierarchical algorithm for making sparse matrices sparser
This paper found that HASP substantially outperformed a previous code for the Sparsity Problem (SP) and that it produced a net savings in optimization time on the NETLIB problems.
A parallel interior point algorithm for linear programming on a network of transputers
A parallel Dual Affine algorithm is presented which is suitable for a parallel computer with a distributed memory and obtains its speedup from parallel sparse linear algebra computations such as Cholesky factorisation, matrix multiplication, and triangular system solving.
Computational results of an interior point algorithm for large scale linear programming
Computational results for an efficient implementation of a variant of the dual projective algorithm for linear programming, using the preconditioned conjugate gradient method to compute projections, indicate that this algorithm has potential as an alternative for solving very large LPs in which direct methods fail due to memory and CPU time requirements.
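The snippet above refers to an iterative alternative to direct factorization. As a sketch only, here is a minimal unpreconditioned conjugate gradient solver in plain Python; the cited work uses preconditioning and sparse data structures, and the matrix `M` and right-hand side below are hypothetical:

```python
def conjugate_gradient(matvec, b, tol=1e-10, max_iter=100):
    """Solve M x = b for SPD M, given only a matrix-vector product."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                # residual b - M x (x starts at zero)
    p = r[:]                # search direction
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Mp = matvec(p)
        alpha = rs / sum(pi * mpi for pi, mpi in zip(p, Mp))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * mpi for ri, mpi in zip(r, Mp)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

# Hypothetical small SPD system
M = [[4.0, 1.0], [1.0, 3.0]]
def mv(v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

x = conjugate_gradient(mv, [1.0, 2.0])
```

The appeal for very large LPs is that only matrix-vector products are needed, so no factor (with its fill-in) is ever stored.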
Presolve Analysis of Linear Programs Prior to Applying an Interior Point Method
Several issues concerning an analysis of large and sparse linear programming problems prior to solving them with an interior point based optimizer are addressed in this paper. Three types of presolve…
Dikin's Convergence Result for the Affine-Scaling Algorithm
The affine-scaling algorithm is an analogue of Karmarkar's linear programming algorithm that uses affine transformations instead of projective transformations. Although this variant lacks some of the…
Making sparse matrices sparser: Computational results
This work considers the problem of making a given matrix as sparse as possible, the Sparsity Problem, and reports encouraging computational results in making linear programming constraint matrices sparser.
Using a Massively Parallel Processor to Solve Large Sparse Linear Programs by an Interior-Point Method
This work describes a strategy that uses a serial "front-end" computer to carry out the sparse part of the elimination and a massively parallel processor to complete the elimination on the dense block.
Making sparse matrices sparser: Computational results
Many optimization algorithms involve repeated processing of a fixed set of linear constraints. If we pre-process the constraint matrix A to be sparser, then algebraic operations on A will become…
A hierarchical algorithm for making sparse matrices sparser
Computational results indicate that this approach to increasing sparsity produces significant net reductions in simplex solution time.