• Corpus ID: 197488

Sparsified Cholesky Solvers for SDD linear systems

@article{Lee2015SparsifiedCS,
  title={Sparsified Cholesky Solvers for SDD linear systems},
  author={Yin Tat Lee and Richard Peng and Daniel A. Spielman},
  journal={ArXiv},
  year={2015},
  volume={abs/1506.08204}
}
We show that Laplacian and symmetric diagonally dominant (SDD) matrices can be well approximated by linear-sized sparse Cholesky factorizations. We show that these matrices have constant-factor approximations of the form $L L^{T}$, where $L$ is a lower-triangular matrix with a number of nonzero entries linear in its dimension. Furthermore, linear systems in $L$ and $L^{T}$ can be solved in $O(n)$ work and $O(\log{n}\log^2\log{n})$ depth, where $n$ is the dimension of the matrix. We present… 
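For intuition about the $L L^{T}$ solve described in the abstract: once a Cholesky factor is available, solving the system reduces to one forward and one backward triangular substitution. The sketch below is a toy Python/NumPy/SciPy illustration (my assumption, not code from the paper): it computes an exact dense Cholesky factorization of a small SDD matrix, whereas the paper's contribution is an *approximate* sparse factor with only $O(n)$ nonzeros, which this example does not construct.

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular

# Small SDD system: the graph Laplacian of a path on 5 vertices,
# plus a small diagonal shift so the matrix is positive definite.
n = 5
A = 2.0 * np.eye(n)
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = -1.0
A += 0.1 * np.eye(n)           # strict diagonal dominance -> SPD

L = cholesky(A, lower=True)    # exact factorization A = L L^T

b = np.arange(1.0, n + 1.0)
y = solve_triangular(L, b, lower=True)      # forward solve:  L y = b
x = solve_triangular(L.T, y, lower=False)   # backward solve: L^T x = y

assert np.allclose(A @ x, b)   # x solves the original SDD system
```

In the paper's setting $L$ has a linear number of nonzeros, so the two substitutions take $O(n)$ work; the dense factor above is only the exact analogue on a tiny example.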


Sparsified Cholesky and multigrid solvers for connection laplacians

TLDR
These new algorithms are used to derive the first nearly linear time algorithms for solving systems of equations in connection Laplacians---a generalization of Laplacian matrices that arise in many problems in image and signal processing.

Minor Sparsifiers and the Distributed Laplacian Paradigm

TLDR
This work studies distributed algorithms built around minor-based vertex sparsifiers, gives the first algorithm in the CONGEST model for solving linear systems in graph Laplacian matrices to high accuracy and presents a nontrivial distributed implementation of their construction.

Solving Directed Laplacian Systems in Nearly-Linear Time through Sparse LU Factorizations

TLDR
It is shown that Eulerian Laplacians (and therefore the Laplacians of all strongly connected directed graphs) have sparse approximate LU-factorizations, and it is proved that once constructed they yield nearly-linear time algorithms for solving directed Laplacian systems.

Numerical Linear Algebra in the Sliding Window Model

TLDR
This work gives a deterministic algorithm that achieves spectral approximation in the sliding window model that can be viewed as a generalization of smooth histograms, using the Loewner ordering of PSD matrices, and gives algorithms for both spectral approximation and low-rank approximation that are space-optimal up to polylogarithmic factors.

Input Sparsity Time Low-rank Approximation via Ridge Leverage Score Sampling

We present a new algorithm for finding a near optimal low-rank approximation of a matrix $A$ in $O(nnz(A))$ time. Our method is based on a recursive sampling scheme for computing a representative

An Efficient Parallel Algorithm for Spectral Sparsification of Laplacian and SDDM Matrix Polynomials

TLDR
The results yield the first efficient parallel algorithm running in nearly linear work and poly-logarithmic depth, and are used to analyze the long-term behaviour of Markov chains in non-trivial settings.

Flows in almost linear time via adaptive preconditioning

TLDR
This work gives an alternate approach for approximating undirected max-flow, and the first almost-linear time approximations of discretizations of total variation minimization objectives.

Solving Linear Programs with Sqrt(rank) Linear System Solves

TLDR
A deterministic polynomial-time computable $\tilde{O}(rank(A))$-self-concordant barrier function for the polytope is provided, resolving an open question of Nesterov and Nemirovski (1994) on the theory of "universal barriers" for interior point methods.

Constructing Linear-Sized Spectral Sparsification in Almost-Linear Time

  • Y. Lee, He Sun
  • Computer Science, Mathematics
    2015 IEEE 56th Annual Symposium on Foundations of Computer Science
  • 2015
TLDR
This work presents the first almost-linear time algorithm for constructing linear-sized spectral sparsifiers for graphs, using a novel combination of two techniques from the literature for constructing spectral sparsification: random sampling by effective resistance, and adaptive constructions based on barrier functions.

Solving Linear Programs with Õ(√rank) Linear System Solves

TLDR
The first proof of an Õ(r)-self-concordant barrier for all polytopes, with r = rank(A), that is polynomial-time computable is provided, resolving an open question of Nesterov and Nemirovski (1994) on the theory of “universal barriers” for interior point methods.

References

SHOWING 1-10 OF 39 REFERENCES

Solving SDD linear systems in nearly $m\log^{1/2}n$ time

TLDR
This work shows that preconditioners constructed by random sampling can perform well without meeting the standard requirements of iterative methods, and obtains a two-pass algorithm for constructing optimal embeddings in snowflake spaces that runs in O(m log log n) time.

An efficient parallel solver for SDD linear systems

TLDR
This work presents the first parallel algorithm for solving systems of linear equations in symmetric, diagonally dominant (SDD) matrices that runs in polylogarithmic time and nearly-linear work, and requires spectral graph sparsifiers.

A simple, combinatorial algorithm for solving SDD systems in nearly-linear time

TLDR
A simple combinatorial algorithm that solves symmetric diagonally dominant (SDD) linear systems in nearly-linear time and has the fastest known running time under the standard unit-cost RAM model.

Spectral Sparsification and Regret Minimization Beyond Matrix Multiplicative Updates

TLDR
This paper explains how matrix MWU naturally arises as an instance of the Follow-the-Regularized-Leader framework and generalizes this approach to yield a larger class of updates, which allows the construction of linear-sized spectral sparsifiers and gives novel insights into the construction of Batson, Spielman and Srivastava.

Solving Elliptic Finite Element Systems in Near-Linear Time with Support Preconditioners

TLDR
This work considers linear systems arising from the use of the finite element method for solving a certain class of linear elliptic problems, which are symmetric and positive semidefinite and well approximated by symmetric diagonally dominant matrices.

Twice-ramanujan sparsifiers

TLDR
It is proved that every graph has a spectral sparsifier with a number of edges linear in its number of vertices, and an elementary deterministic polynomial time algorithm is given for constructing H, which approximates G spectrally at least as well as a Ramanujan expander with dn/2 edges approximates the complete graph.

An Algebraic Multigrid Method with Guaranteed Convergence Rate

TLDR
An algebraic multigrid method is presented which has a guaranteed convergence rate for the class of nonsingular symmetric M-matrices with nonnegative row sum; the bound is analytically shown to hold for the model Poisson problem.

Uniform Sampling for Matrix Approximation

TLDR
It is shown that uniform sampling yields a matrix that, in some sense, well approximates a large fraction of the original, which leads to simple iterative row sampling algorithms for matrix approximation that run in input-sparsity time and preserve row structure and sparsity at all intermediate steps.

Simple Parallel and Distributed Algorithms for Spectral Graph Sparsification

TLDR
This work obtains the first distributed spectral sparsification algorithm in the CONGEST model, and combines this algorithm with the parallel framework of Peng and Spielman for solving symmetric diagonally dominant linear systems.

Efficient preconditioning of laplacian matrices for computer graphics

We present a new multi-level preconditioning scheme for discrete Poisson equations that arise in various computer graphics applications such as colorization, edge-preserving decomposition for