Tensor-Train Decomposition
  • I. Oseledets
  • Mathematics, Computer Science
  • SIAM J. Sci. Comput.
  • 1 September 2011
A simple nonrecursive form of the tensor decomposition in $d$ dimensions is presented. It does not inherently suffer from the curse of dimensionality and has asymptotically the same number of …
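The TT format above admits a simple constructive algorithm: sequential truncated SVDs of unfoldings. Below is a minimal NumPy sketch of this TT-SVD idea; the function names and the relative truncation rule are our illustrative choices, not code from the paper.

```python
import numpy as np

def tt_svd(tensor, eps=1e-10):
    """Decompose a d-way array into tensor-train (TT) cores by
    sequential truncated SVDs of unfoldings (a sketch of TT-SVD)."""
    dims = tensor.shape
    cores, r, C = [], 1, tensor
    for k in range(len(dims) - 1):
        C = C.reshape(r * dims[k], -1)           # unfold: merged left indices x rest
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        rank = max(1, int(np.sum(s > eps * s[0])))  # relative singular-value cutoff
        cores.append(U[:, :rank].reshape(r, dims[k], rank))
        C = s[:rank, None] * Vt[:rank]           # carry the remainder forward
        r = rank
    cores.append(C.reshape(r, dims[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract TT cores back into the full array."""
    res = cores[0]
    for G in cores[1:]:
        res = np.tensordot(res, G, axes=([-1], [0]))
    return res.reshape([G.shape[1] for G in cores])
```

With `eps` small the reconstruction is exact; for a tensor with genuine low-rank structure the intermediate ranks shrink, which is where the savings come from.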
Speeding-up Convolutional Neural Networks Using Fine-tuned CP-Decomposition
We propose a simple two-step approach for speeding up convolution layers within large convolutional neural networks based on tensor decomposition and discriminative fine-tuning. Given a layer, we use …
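The speed-up comes from replacing a dense 4-way convolution kernel of shape T x S x d x d with R rank-one CP terms, stored as four small factor matrices. A back-of-the-envelope parameter count makes the saving concrete; the layer sizes and rank below are illustrative numbers of ours, not figures from the paper.

```python
# Parameter count of a full convolution kernel vs. its rank-R CP factors.
# Hypothetical layer: 256 output channels, 128 input channels, 3x3 kernel, CP rank 16.
T, S, d, R = 256, 128, 3, 16

full_params = T * S * d * d        # dense 4-way kernel: T*S*d*d entries
cp_params = R * (T + S + d + d)    # four CP factor matrices of sizes T, S, d, d by R

print(full_params, cp_params, full_params / cp_params)
```

At these sizes the factored kernel stores roughly 47x fewer parameters; the paper's second step, discriminative fine-tuning, recovers the accuracy lost to the low-rank approximation.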
TT-cross approximation for multidimensional arrays
As is well known, a rank-$r$ matrix can be recovered from a cross of $r$ linearly independent columns and rows, and an arbitrary matrix can be interpolated on the cross entries. Other entries …
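The skeleton identity the abstract starts from, $A = C \hat{A}^{-1} R$ for a rank-$r$ matrix with a nonsingular $r \times r$ intersection, is easy to verify numerically. In the sketch below the row and column indices are picked arbitrarily; cross-approximation methods select them adaptively.

```python
import numpy as np

rng = np.random.default_rng(0)
r, m, n = 3, 8, 10
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # exactly rank r

I = [0, 1, 2]  # r row indices (arbitrary here; cross methods choose them adaptively)
J = [4, 5, 6]  # r column indices
C = A[:, J]                  # column cross
R = A[I, :]                  # row cross
A_hat = A[np.ix_(I, J)]      # r x r intersection, assumed nonsingular

A_skel = C @ np.linalg.solve(A_hat, R)  # skeleton (cross) reconstruction
err = np.linalg.norm(A_skel - A)        # zero up to rounding at exact rank r
```

For matrices that are only approximately low-rank, the error depends on how well-conditioned the chosen cross is, which motivates the maxvol-type submatrix search discussed in a later entry.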
Breaking the Curse of Dimensionality, Or How to Use SVD in Many Dimensions
For $d$-dimensional tensors with possibly large $d>3$, a hierarchical data structure, called the Tree-Tucker format, is presented as an alternative to the canonical decomposition. It has …
Unifying time evolution and optimization with matrix product states
We show that the time-dependent variational principle provides a unifying framework for time-evolution methods and optimization methods in the context of matrix product states. In particular, we …
How to find a good submatrix
Pseudoskeleton approximation and some other problems require knowledge of a sufficiently well-conditioned submatrix of a large-scale matrix. The quality of a submatrix can be measured by the modulus of …
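A natural quality measure for an $r \times r$ submatrix of a tall matrix is its volume (the modulus of its determinant), and the classic greedy iteration swaps one row at a time as long as each swap grows the volume. The sketch below captures that idea; the starting rows, tolerance, and function name are our choices, not the paper's exact algorithm.

```python
import numpy as np

def maxvol(A, tol=1.05, max_iter=100):
    """Greedy search for a well-conditioned (near-dominant) r x r submatrix
    in a tall n x r matrix A; returns the selected row indices.
    A sketch of the maxvol idea, not the paper's exact algorithm."""
    n, r = A.shape
    idx = np.arange(r)  # naive start: first r rows (assumed nonsingular here;
                        # pivoted LU would give a more robust initialization)
    for _ in range(max_iter):
        B = A @ np.linalg.inv(A[idx])        # coefficients; B[idx] is the identity
        i, j = np.unravel_index(np.argmax(np.abs(B)), B.shape)
        if abs(B[i, j]) <= tol:
            break                            # submatrix is (near-)dominant: stop
        idx[j] = i                           # swap: multiplies |det| by |B[i, j]| > tol
    return idx
```

At convergence every entry of `A @ inv(A[idx])` is at most `tol` in modulus, which bounds how far the chosen submatrix is from the maximal-volume one.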
Solution of Linear Systems and Matrix Inversion in the TT-Format
Tensors arise naturally in high-dimensional problems in chemistry, financial mathematics, and many other areas. The numerical treatment of such problems is difficult due to the curse of dimensionality …
Tensor Networks for Dimensionality Reduction and Large-scale Optimization: Part 1 Low-Rank Tensor Decompositions
Modern applications in engineering and data science are increasingly based on multidimensional data of exceedingly high volume, variety, and structural richness. However, standard machine learning …
A projector-splitting integrator for dynamical low-rank approximation
The dynamical low-rank approximation of time-dependent matrices is a low-rank factorization updating technique. It leads to differential equations for factors of the matrices, which need to be solved …
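The projector-splitting (KSL) integrator advances $Y \approx U S V^T$ through three substeps, one per factor, with the middle $S$-substep integrated backward in time (the minus sign), which is the integrator's characteristic feature. The sketch below uses a single explicit Euler step for each substep; that simplification, and the function name, are ours, not the paper's exact scheme.

```python
import numpy as np

def ksl_step(U, S, Vt, F, h):
    """One first-order projector-splitting (KSL) step for the dynamical
    low-rank approximation dY/dt = F(Y), with Y ~ U @ S @ Vt.
    Euler substeps are used for simplicity; a sketch, not the paper's code."""
    V = Vt.T
    # K-substep: K = U S, integrate dK/dt = F(K V^T) V forward
    K = U @ S
    K = K + h * F(K @ V.T) @ V
    U, S = np.linalg.qr(K)                   # re-orthogonalize the left factor
    # S-substep: integrate dS/dt = -U^T F(U S V^T) V backward (note the minus)
    S = S - h * U.T @ F(U @ S @ V.T) @ V
    # L-substep: L = V S^T, integrate dL/dt = F(U L^T)^T U forward
    L = V @ S.T
    L = L + h * F(U @ L.T).T @ U
    V, St = np.linalg.qr(L)                  # re-orthogonalize the right factor
    return U, St.T, V.T
```

A useful sanity check: with $F \equiv 0$ the step must reproduce $U S V^T$ exactly, and the returned factors stay orthonormal, which the assertions below verify.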
Computation of extreme eigenvalues in higher dimensions using block tensor train format
We consider approximate computation of several minimal eigenpairs of large Hermitian matrices which come from high-dimensional problems. We use the tensor train (TT) format for vectors and matrices …