# Supervised Learning with Quantum-Inspired Tensor Networks

```bibtex
@article{Stoudenmire2016SupervisedLW,
  title   = {Supervised Learning with Quantum-Inspired Tensor Networks},
  author  = {Edwin Miles Stoudenmire and David J. Schwab},
  journal = {ArXiv},
  volume  = {abs/1605.05775},
  year    = {2016}
}
```

Tensor networks are efficient representations of high-dimensional tensors which have been very successful for physics and mathematics applications. We demonstrate how algorithms for optimizing such networks can be adapted to supervised learning tasks by using matrix product states (tensor trains) to parameterize models for classifying images. For the MNIST data set we obtain less than 1% test set classification error. We discuss how the tensor network form imparts additional structure to the…
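The classifier described in the abstract can be sketched in a few lines: each pixel is lifted to the two-component local feature vector [cos(πx/2), sin(πx/2)] used in the paper, and the resulting product state is contracted against a matrix product state. As a simplifying assumption, the label index is placed on the last core here (the paper attaches it to an interior site); all function names below are illustrative.

```python
import numpy as np

def feature_map(pixels):
    # Map each pixel value x in [0, 1] to the 2-component local feature
    # vector [cos(pi*x/2), sin(pi*x/2)] from the paper.
    return np.stack([np.cos(np.pi * pixels / 2),
                     np.sin(np.pi * pixels / 2)], axis=-1)

def mps_classify(cores, pixels):
    # Contract an MPS classifier with the product-state feature map.
    # `cores` is a list of N tensors of shape (left_bond, 2, right_bond);
    # the final core carries the label index: (left_bond, 2, n_labels).
    phis = feature_map(pixels)            # shape (N, 2)
    v = np.ones(1)                        # left boundary vector
    for core, phi in zip(cores[:-1], phis[:-1]):
        v = np.einsum('l,lpr,p->r', v, core, phi)
    # Final contraction yields one score per label
    return np.einsum('l,lpc,p->c', v, cores[-1], phis[-1])

# Toy usage: 4 "pixels", bond dimension 3, 10 labels
rng = np.random.default_rng(0)
cores = [rng.normal(size=(1, 2, 3)),
         rng.normal(size=(3, 2, 3)),
         rng.normal(size=(3, 2, 3)),
         rng.normal(size=(3, 2, 10))]
scores = mps_classify(cores, rng.random(4))
```

Training would then optimize the cores against a loss on `scores`, e.g. with the DMRG-style sweeping the paper adapts.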


## 87 Citations

Tensor networks and machine learning for approximating and optimizing functions in quantum physics

- Computer Science
- 2018

Novel methods for the approximation of physical quantities and the optimization of performance criteria in quantum control are introduced, analyzed and evaluated based on techniques from the fields of tensor networks, numerical analysis and machine learning.

Quantum-inspired event reconstruction with Tensor Networks: Matrix Product States

- Computer Science, Physics · ArXiv
- 2021

This study presents the discrimination of a top-quark signal over QCD background processes using a Matrix Product State classifier, and shows that entanglement entropy can be used to interpret what the network learns and to reduce the complexity of the network and feature space without loss of generality or performance.

TensorNetwork: A Library for Physics and Machine Learning

- Mathematics, Physics · ArXiv
- 2019

The use of the TensorNetwork API is demonstrated with applications in both physics and machine learning, with details appearing in companion papers.

Supervised learning with generalized tensor networks

- Mathematics, Computer Science · ArXiv
- 2018

This work explores the connection between tensor networks and probabilistic graphical models, and shows that it motivates the definition of generalized tensor networks, in which information from a tensor can be copied and reused in other parts of the network.

Number-State Preserving Tensor Networks as Classifiers for Supervised Learning

- Computer Science, Physics · ArXiv
- 2019

A restricted class of tensor network states, built from number-state preserving tensors, is proposed and argued to be both a natural choice for classifiers and as powerful as generic (unrestricted) tensor networks for this task.

An end-to-end trainable hybrid classical-quantum classifier

- Physics, Computer Science · Machine Learning: Science and Technology
- 2021

It is shown that compared to the principal component analysis, a tensor network based on the matrix product state with low bond dimensions performs better as a feature extractor for the input data of the variational quantum circuit in the binary and ternary classification of MNIST and Fashion-MNIST datasets.

TensorNetwork for Machine Learning

- Computer Science, Physics · ArXiv
- 2019

The encoding of image data into a matrix product state form is explained in detail, and how to contract the network in a way that is parallelizable and well-suited to automatic gradients for optimization is described.
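A minimal illustration of the parallelizable contraction mentioned above: once each pixel's feature vector is absorbed into its core, the chain reduces to an ordinary product of bond-space matrices, which can be multiplied pairwise in a balanced tree. The names below are illustrative sketches, not the TensorNetwork library's API.

```python
import numpy as np
from functools import reduce

def site_matrices(cores, phis):
    # Absorb each local feature vector into its MPS core:
    # M_i[l, r] = sum_p phis[i][p] * cores[i][l, p, r]
    return [np.einsum('lpr,p->lr', c, f) for c, f in zip(cores, phis)]

def tree_contract(mats):
    # Multiply the chain as a balanced binary tree; the products at each
    # level are mutually independent, so they could run in parallel,
    # giving log-depth instead of linear-depth contraction.
    while len(mats) > 1:
        nxt = [mats[i] @ mats[i + 1] for i in range(0, len(mats) - 1, 2)]
        if len(mats) % 2:
            nxt.append(mats[-1])      # carry an odd leftover up a level
        mats = nxt
    return mats[0]

# Sanity check against the sequential left-to-right product
rng = np.random.default_rng(1)
cores = [rng.normal(size=(4, 2, 4)) for _ in range(6)]
phis = [rng.normal(size=2) for _ in range(6)]
mats = site_matrices(cores, phis)
tree = tree_contract(mats)
seq = reduce(np.matmul, mats)
```

Because matrix multiplication is associative, the tree and sequential orders give the same result, which is what makes the parallel scheme safe.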

Tree Tensor Networks for Generative Modeling

- Computer Science, Physics · Physical Review B
- 2019

It is shown that the TTN is superior to MPSs for generative modeling at capturing pixel correlations in natural images, and gives better log-likelihood scores on standard datasets of handwritten digits.

Robust supervised learning based on tensor network method

- Computer Science · 2018 33rd Youth Academic Annual Conference of Chinese Association of Automation (YAC)
- 2018

Methods to decompose the long-chain TN into short chains are proposed, which improve the convergence properties of the training algorithm by enabling stable stochastic gradient descent (SGD).

Tensor network language model

- Computer Science, Physics · ArXiv
- 2017

We propose a new statistical model suitable for machine learning of systems with long distance correlations such as natural languages. The model is based on directed acyclic graph decorated by…

## References

Showing 1–10 of 61 references

Perfect Sampling with Unitary Tensor Networks

- Mathematics, Physics
- 2012

Tensor network states are powerful variational Ansätze for many-body ground states of quantum lattice models. The use of Monte Carlo sampling techniques in tensor network approaches significantly…

The Alternating Linear Scheme for Tensor Optimization in the Tensor Train Format

- Mathematics, Computer Science · SIAM J. Sci. Comput.
- 2012

This article shows how optimization tasks can be treated in the TT format by a generalization of the well-known alternating least squares (ALS) algorithm and by a modified approach (MALS) that enables dynamical rank adaptation.

Hand-waving and Interpretive Dance: An Introductory Course on Tensor Networks

- Mathematics, Physics
- 2016

The curse of dimensionality associated with the Hilbert space of spin systems provides a significant obstruction to the study of condensed matter systems. Tensor networks have proven an important…

Tensor Networks for Big Data Analytics and Large-Scale Optimization Problems

- Computer Science, Mathematics · ArXiv
- 2014

The main objective of this paper is to show how tensor networks can be used to solve a wide class of big-data optimization problems by applying tensorization, performing all operations on relatively small matrices and tensors, and iteratively applying optimized, approximate tensor contractions.

An exact mapping between the Variational Renormalization Group and Deep Learning

- Computer Science, Mathematics · ArXiv
- 2014

This work constructs an exact mapping between the variational renormalization group, first introduced by Kadanoff, and deep learning architectures based on Restricted Boltzmann Machines (RBMs), suggesting that deep learning algorithms may employ a generalized RG-like scheme to learn relevant features from data.

Tensor Network States and Geometry

- Mathematics, Physics
- 2011

Tensor network states are used to approximate ground states of local Hamiltonians on a lattice in D spatial dimensions. Different types of tensor network states can be seen to generate different…

Tensorizing Neural Networks

- Computer Science · NIPS
- 2015

This paper converts the dense weight matrices of the fully-connected layers to the Tensor Train format such that the number of parameters is reduced by a huge factor and at the same time the expressive power of the layer is preserved.
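The "huge factor" of parameter reduction can be made concrete with a back-of-the-envelope count. The sizes below are hypothetical (a 1024×1024 layer split into five input and five output modes of size 4, uniform TT-rank r), chosen only to illustrate the scaling.

```python
def tt_layer_params(modes_in, modes_out, r):
    # Parameter count for a TT-factorized weight matrix: each interior
    # core has shape (r, m_in, m_out, r); the boundary cores have an
    # outer bond dimension of 1.
    total = 0
    n = len(modes_in)
    for i, (mi, mo) in enumerate(zip(modes_in, modes_out)):
        left = 1 if i == 0 else r
        right = 1 if i == n - 1 else r
        total += left * mi * mo * right
    return total

dense = 1024 * 1024                       # dense fully-connected layer
tt = tt_layer_params([4] * 5, [4] * 5, r=8)
```

Here the dense layer stores about a million weights while the TT form stores a few thousand, at the cost of restricting the layer to low-TT-rank matrices.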

Exponential Machines

- Computer Science · ICLR
- 2017

This paper introduces Exponential Machines (ExM), a predictor that models all interactions of every order in a factorized format called Tensor Train (TT), and shows that the model achieves state-of-the-art performance on synthetic data with high-order interactions and works on par on a recommender system dataset MovieLens 100K.

Tensor Network Renormalization

- Physics, Medicine · Physical Review Letters
- 2015

We introduce a coarse-graining transformation for tensor networks that can be applied to study both the partition function of a classical statistical system and the Euclidean path integral of a…

Tensor-Train Decomposition

- Mathematics, Computer Science · SIAM J. Sci. Comput.
- 2011

The new form gives a clear and convenient way to implement all basic operations efficiently, and the efficiency is demonstrated by the computation of the smallest eigenvalue of a 19-dimensional operator.
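The tensor-train construction behind this reference can be sketched with sequential SVDs. This is the standard TT-SVD outline; the fixed `max_rank` truncation below is a simplification of the paper's accuracy-based rank selection.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    # Sweep left to right: reshape the remainder into a matrix, take an
    # SVD, keep at most max_rank singular values, and store the left
    # factor as a 3-index TT core of shape (r_left, dim, r_right).
    dims = tensor.shape
    cores, r = [], 1
    mat = np.asarray(tensor)
    for d in dims[:-1]:
        mat = mat.reshape(r * d, -1)
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r_new = min(max_rank, len(s))
        cores.append(u[:, :r_new].reshape(r, d, r_new))
        mat = s[:r_new, None] * vt[:r_new]   # remainder for next sweep
        r = r_new
    cores.append(mat.reshape(r, dims[-1], 1))
    return cores

def tt_reconstruct(cores):
    # Contract the cores back into a full tensor (for verification).
    full = cores[0]
    for c in cores[1:]:
        full = np.tensordot(full, c, axes=([-1], [0]))
    return full.reshape([c.shape[1] for c in cores])

rng = np.random.default_rng(2)
T = rng.normal(size=(3, 4, 5))
cores = tt_svd(T, max_rank=64)   # rank cap large enough to be exact here
```

With the rank cap above the tensor's true TT-ranks the decomposition is exact; lowering `max_rank` trades accuracy for the compact storage the reference describes.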