Optimization of Functions Given in the Tensor Train Format

@article{Chertkov2022OptimizationOF,
  title={Optimization of Functions Given in the Tensor Train Format},
  author={Andrei Chertkov and Gleb V. Ryzhakov and Georgii S. Novikov and Ivan Oseledets},
  journal={ArXiv},
  year={2022},
  volume={abs/2209.14808}
}
The tensor train (TT) format is a common approach for computationally efficient work with multidimensional arrays, vectors, matrices, and discretized functions in a wide range of applications, including computational mathematics and machine learning. In this work, we propose a new algorithm for TT-tensor optimization, which leads to very accurate approximations of the minimum and maximum tensor element. The method consists of sequential tensor multiplications of the TT-cores with an intelligent…
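
The abstract is truncated, but the gist (sequential contractions of TT-cores with a selection of promising candidates) suggests a beam-search-style sweep. Below is a minimal, hypothetical Python sketch of that idea; the core shapes follow the usual TT convention, and the top-k norm-based selection rule and the parameter k are assumptions, not the paper's exact procedure.

```python
import numpy as np

def tt_max_element(cores, k=10):
    """Hypothetical sketch: approximate the maximum-modulus element of a
    TT-tensor by a left-to-right sweep that keeps the k most promising
    candidate multi-indices (a beam-search reading of the truncated
    abstract; the paper's actual selection rule may differ).

    cores : list of 3D arrays of shape (r_prev, n, r_next),
            with boundary ranks r_0 = r_d = 1.
    """
    prefixes = [[]]                    # candidate multi-indices built so far
    interfaces = np.ones((1, 1))       # (num_candidates, current_rank)

    for G in cores:
        r1, n, r2 = G.shape
        # Extend every candidate by every index of the current mode:
        # ext[c, i, :] is the interface vector of prefix c extended by i.
        ext = np.einsum('cr,ris->cis', interfaces, G)
        # Score candidates by the norm of their interface (a simple proxy).
        scores = np.linalg.norm(ext.reshape(-1, r2), axis=1)
        top = np.argsort(scores)[::-1][:k]
        prefixes, interfaces = (
            [prefixes[t // n] + [int(t % n)] for t in top],
            np.stack([ext[t // n, t % n] for t in top]),
        )

    # After the last core each interface is a scalar: an actual tensor value.
    vals = interfaces[:, 0]
    best = int(np.argmax(np.abs(vals)))
    return prefixes[best], vals[best]
```

On small random instances the returned value can be checked against np.max(np.abs(...)) of the dense tensor; for modest dimensions and ranks the sweep is cheap because only k candidates are carried through each core.
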
1 Citation

PROTES: Probabilistic Optimization with Tensor Sampling

In numerical experiments, the new method PROTES, based on probabilistic sampling from a probability density function given in the low-parametric tensor train format, outperforms existing popular discrete optimization methods.
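
PROTES draws candidate optima from a distribution stored as a TT-tensor. A standard way to draw exact samples from such a distribution is a left-to-right sweep over conditionals; the sketch below illustrates that mechanism under the simplifying assumption that the tensor is elementwise nonnegative (the function name and structure are illustrative, not the PROTES API).

```python
import numpy as np

def tt_sample(cores, rng=None):
    """Sketch: sample a multi-index from a nonnegative TT-tensor viewed as
    an unnormalized probability mass function, via sequential conditionals.
    Each core has shape (r_prev, n, r_next) with boundary ranks 1."""
    rng = rng or np.random.default_rng()
    # Right-to-left "tail" vectors: tails[k] sums out all modes k..d-1.
    tails = [np.ones(1)]
    for G in reversed(cores):
        tails.append(np.einsum('ris,s->r', G, tails[-1]))
    tails = tails[::-1]                  # tails[k + 1] sits right of core k

    index, u = [], np.ones(1)            # u is the left interface vector
    for k, G in enumerate(cores):
        # Conditional distribution over mode k given the sampled prefix.
        p = np.einsum('r,ris,s->i', u, G, tails[k + 1])
        p = np.maximum(p, 0)             # guard against tiny negatives
        p /= p.sum()
        i = rng.choice(len(p), p=p)
        index.append(int(i))
        u = u @ G[:, i, :]               # extend the prefix interface
    return index
```

For example, tt_sample([np.random.rand(1, 4, 3), np.random.rand(3, 4, 1)]) returns a 2-element multi-index drawn in proportion to the tensor's entries; the cost per sample is linear in the number of dimensions.
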

References

SHOWING 1-10 OF 13 REFERENCES

Black box approximation in the tensor train format initialized by ANOVA decomposition

The performed numerical computations for a number of multidimensional model problems, including a parametric partial differential equation, demonstrate a significant advantage of the approach over the commonly used random initial approximation.

Tensor-Train Decomposition

The new form gives a clear and convenient way to implement all basic operations efficiently, and the efficiency is demonstrated by the computation of the smallest eigenvalue of a 19-dimensional operator.
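
As a concrete illustration of why these basic operations are cheap: evaluating a single element of a d-dimensional TT-tensor reduces to a chain of d small matrix products. A minimal sketch under the standard TT convention (cores of shape (r_prev, n, r_next) with boundary ranks 1):

```python
import numpy as np

def tt_get(cores, index):
    """Evaluate one element of a TT-tensor as a product of core slices.
    Cost is O(d * r^2) for d cores of rank r, versus O(n^d) storage
    for the full tensor."""
    v = np.ones((1, 1))
    for G, i in zip(cores, index):
        v = v @ G[:, i, :]      # (1, r_prev) @ (r_prev, r_next)
    return v[0, 0]
```
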

Iterative Power Algorithm for Global Optimization with Quantics Tensor Trains.

The iterative power algorithm (IPA) is presented for global optimization, together with a formal proof of convergence for both discrete and continuous global search problems, which is essential for applications in chemistry such as molecular geometry optimization.

A Hybrid Alternating Least Squares-TT-Cross Algorithm for Parametric PDEs

An algorithm is proposed that hybridizes the alternating least squares and TT-cross methods; it exploits and preserves the block-diagonal structure of the discretized operator in stochastic collocation schemes and computes a TT approximation of the whole solution.

Tensor Networks for Dimensionality Reduction and Large-scale Optimization: Part 1 Low-Rank Tensor Decompositions

A focus is on the Tucker and tensor train (TT) decompositions and their extensions, and on demonstrating the ability of tensor networks to provide linearly or even super-linearly (e.g., logarithmically) scalable solutions, as illustrated in detail in Part 2 of this monograph.

Are Quantum Computers Practical Yet? A Case for Feature Selection in Recommender Systems using Tensor Networks

Collaborative filtering models generally perform better than content-based filtering models and do not require careful feature engineering. However, in the cold-start scenario collaborative…

Global Optimization of Surface Warpage for Inverse Design of Ultra-Thin Electronic Packages using Tensor Train Decomposition

An inverse design framework is implemented to optimize the surface warpage profiles of the package using a heuristic global optimization method, viz.…

A literature survey of benchmark functions for global optimisation problems

This work reviews and compiles a rich set of 175 benchmark functions for unconstrained optimisation problems with diverse properties in terms of modality, separability, and valley landscape; it is by far the most complete set of such functions in the literature.
