Preconditioned Low-rank Riemannian Optimization for Linear Systems with Tensor Product Structure

Abstract

The numerical solution of partial differential equations on high-dimensional domains gives rise to computationally challenging linear systems. When using standard discretization techniques, the size of the linear system grows exponentially with the number of dimensions, making the use of classic iterative solvers infeasible. In recent years, low-rank tensor approaches have been developed that mitigate this curse of dimensionality by exploiting the underlying structure of the linear operator. In this work, we focus on tensors represented in the Tucker and tensor train formats. We propose two preconditioned gradient methods on the corresponding low-rank tensor manifolds: a Riemannian version of the preconditioned Richardson method and an approximate Newton scheme based on the Riemannian Hessian. For the latter, considerable attention is given to the efficient solution of the resulting Newton equation. In numerical experiments, we compare the efficiency of our Riemannian algorithms with other established tensor-based approaches such as a truncated preconditioned Richardson method and the alternating linear scheme. The results show that our approximate Riemannian Newton scheme is significantly faster in cases where the application of the linear operator is expensive.
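As a rough illustration of the truncated preconditioned Richardson idea that the paper compares against, here is a minimal two-dimensional (matrix) sketch in NumPy. The linear system is a Lyapunov equation A X + X Aᵀ = B, the 2D instance of a Kronecker-structured system; the preconditioner applies the inverse of a shifted copy of the same operator via an eigendecomposition; and each iterate is truncated back to low rank by an SVD. All names and parameter choices (grid size `n`, truncation rank, shift `c`, iteration count) are illustrative assumptions, not taken from the paper, and in higher dimensions the SVD truncation would be replaced by Tucker (HOSVD) or TT rounding.

```python
import numpy as np

def truncate(X, r):
    """Rank-r truncation via a thin SVD (the 2D analogue of tensor rounding)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

n, rank, c = 30, 12, 0.01  # illustrative sizes and shift, not from the paper
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)  # 1D discrete Laplacian
B = truncate(np.random.default_rng(0).standard_normal((n, n)), rank)

# Preconditioner: exact solve with the shifted operator
# (A + cI) Y + Y (A + cI)^T = R, applied via an eigendecomposition of A + cI.
lam, Q = np.linalg.eigh(A + c * np.eye(n))

def prec(R):
    return Q @ ((Q.T @ R @ Q) / (lam[:, None] + lam[None, :])) @ Q.T

# Truncated preconditioned Richardson: X <- T_r(X + P^{-1}(B - L(X))),
# where L(X) = A X + X A^T and T_r truncates back to rank r.
X = np.zeros((n, n))
for _ in range(60):
    X = truncate(X + prec(B - (A @ X + X @ A.T)), rank)

res = np.linalg.norm(B - (A @ X + X @ A.T)) / np.linalg.norm(B)
print(f"relative residual after 60 truncated Richardson steps: {res:.2e}")
```

The Riemannian variants in the paper differ from this sketch in that the preconditioned residual is first projected onto the tangent space of the low-rank manifold before the retraction (truncation), which keeps every operation rank-structured.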

DOI: 10.1137/15M1032909

Cite this paper

@article{Kressner2016PreconditionedLR,
  title   = {Preconditioned Low-rank Riemannian Optimization for Linear Systems with Tensor Product Structure},
  author  = {Daniel Kressner and Michael Steinlechner and Bart Vandereycken},
  journal = {SIAM J. Scientific Computing},
  year    = {2016},
  volume  = {38}
}