Corpus ID: 15905470

Supervised Learning with Quantum-Inspired Tensor Networks

@article{Stoudenmire2016SupervisedLW,
  title={Supervised Learning with Quantum-Inspired Tensor Networks},
  author={Edwin Miles Stoudenmire and David J. Schwab},
  journal={ArXiv},
  year={2016},
  volume={abs/1605.05775}
}
Tensor networks are efficient representations of high-dimensional tensors which have been very successful for physics and mathematics applications. We demonstrate how algorithms for optimizing such networks can be adapted to supervised learning tasks by using matrix product states (tensor trains) to parameterize models for classifying images. For the MNIST data set we obtain less than 1% test set classification error. We discuss how the tensor network form imparts additional structure to the …
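As a rough sketch of the model class this abstract describes, the NumPy snippet below lifts each pixel with the local feature map phi(x) = [cos(pi x/2), sin(pi x/2)] used in the paper and contracts the resulting product state with an MPS whose middle core carries the label index. The bond dimension, core shapes, label-core placement, and random weights are illustrative choices, not the paper's trained setup, which optimizes the cores with a DMRG-style sweeping algorithm.

import numpy as np

def local_feature_map(pixels):
    """Map each pixel value in [0, 1] to a 2-vector, one per site."""
    return np.stack([np.cos(np.pi * pixels / 2),
                     np.sin(np.pi * pixels / 2)], axis=-1)   # (n_sites, 2)

def mps_scores(cores, label_core, features):
    """Contract product-state features with an MPS weight tensor.

    cores:      list of arrays of shape (D_left, 2, D_right)
    label_core: index of the core carrying an extra label axis,
                with shape (D_left, 2, n_labels, D_right)
    features:   (n_sites, 2) local feature vectors
    Returns one score per class label.
    """
    v = np.ones((1,))                            # left boundary vector
    out = None
    for j, core in enumerate(cores):
        if j == label_core:
            # absorb the physical index, keep the label axis open
            m = np.einsum('p,apld->ald', features[j], core)
            out = np.einsum('a,ald->ld', v, m)   # (n_labels, D_right)
        elif out is None:
            v = v @ np.einsum('p,apd->ad', features[j], core)
        else:
            out = out @ np.einsum('p,apd->ad', features[j], core)
    return out[:, 0]                             # close the right boundary

# toy usage: 16 "pixels", bond dimension 4, 10 classes
rng = np.random.default_rng(0)
n, D, L = 16, 4, 10
cores = [rng.normal(size=(1 if j == 0 else D, 2, 1 if j == n - 1 else D)) / D
         for j in range(n)]
cores[n // 2] = rng.normal(size=(D, 2, L, D)) / D   # label-carrying core
phi = local_feature_map(rng.random(n))
print(mps_scores(cores, n // 2, phi))               # 10 class scores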
Citations

Tensor networks and machine learning for approximating and optimizing functions in quantum physics
TLDR: Novel methods for the approximation of physical quantities and the optimization of performance criteria in quantum control are introduced, analyzed, and evaluated, based on techniques from the fields of tensor networks, numerical analysis, and machine learning.
Quantum-inspired event reconstruction with Tensor Networks: Matrix Product States
TLDR: This study presents the discrimination of top-quark signal over QCD background processes using a Matrix Product State classifier, and shows that entanglement entropy can be used to interpret what the network learns and to reduce the complexity of the network and feature space without loss of generality or performance.
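As a generic illustration of that diagnostic (not the study's pipeline), the sketch below bipartitions a normalized state vector at a bond and computes the von Neumann entropy of its Schmidt spectrum; in practice a trained MPS classifier would supply the state.

import numpy as np

def bond_entropy(psi, n_left_sites, site_dim=2):
    """Entanglement entropy across the cut after `n_left_sites` sites."""
    m = psi.reshape(site_dim ** n_left_sites, -1)   # bipartition matrix
    s = np.linalg.svd(m, compute_uv=False)
    p = s ** 2
    p = p[p > 1e-12] / p.sum()                      # Schmidt spectrum
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)
psi = rng.normal(size=2 ** 8)
psi /= np.linalg.norm(psi)
# low entropy at a bond means the two halves are weakly correlated,
# so the bond dimension there can be truncated with little loss
print([round(bond_entropy(psi, k), 3) for k in range(1, 8)])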
TensorNetwork: A Library for Physics and Machine Learning
TLDR: The use of the TensorNetwork API is demonstrated with applications in both physics and machine learning, with details appearing in companion papers.
Supervised learning with generalized tensor networks
TLDR: This work explores the connection between tensor networks and probabilistic graphical models, and shows that it motivates the definition of generalized tensor networks, where information from a tensor can be copied and reused in other parts of the network.
Number-State Preserving Tensor Networks as Classifiers for Supervised Learning
TLDR: A restricted class of tensor network state, built from number-state preserving tensors, is proposed and argued to be a natural choice for classifiers, as powerful in this task as generic (unrestricted) tensor networks.
An end-to-end trainable hybrid classical-quantum classifier
TLDR: Compared with principal component analysis, a tensor network based on a matrix product state with low bond dimension is shown to perform better as a feature extractor for the input data of a variational quantum circuit in binary and ternary classification of the MNIST and Fashion-MNIST datasets.
TensorNetwork for Machine Learning
TLDR: The encoding of image data into matrix product state form is explained in detail, along with how to contract the network in a way that is parallelizable and well suited to automatic gradients for optimization.
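A minimal sketch of that contraction pattern, with illustrative shapes: once each site's feature vector is absorbed into its MPS core, the remaining chain is a product of matrices, which can be reduced pairwise in O(log n) steps instead of one long sequential sweep.

import numpy as np

def tree_contract(mats):
    """Multiply a list of matrices by pairwise (tree) reduction."""
    while len(mats) > 1:
        # each pair below could be multiplied on a separate device/core
        mats = [mats[i] @ mats[i + 1] if i + 1 < len(mats) else mats[i]
                for i in range(0, len(mats), 2)]
    return mats[0]

rng = np.random.default_rng(0)
D, n = 4, 16
mats = [rng.normal(size=(D, D)) / np.sqrt(D) for _ in range(n)]
seq = np.linalg.multi_dot(mats)            # sequential reference
assert np.allclose(tree_contract(mats), seq)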
Tree Tensor Networks for Generative Modeling
TLDR: The TTN is shown to be superior to MPSs for generative modeling, both in capturing correlations between pixels in natural images and in giving better log-likelihood scores on standard datasets of handwritten digits.
Robust supervised learning based on tensor network method
  • Y. Chen, K. Guo, Y. Pan
  • Computer Science
  • 2018 33rd Youth Academic Annual Conference of Chinese Association of Automation (YAC)
  • 2018
TLDR: Methods to decompose the long-chain TN into short chains are proposed, which improve the convergence of the training algorithm by allowing stable stochastic gradient descent (SGD).
Tensor network language model
We propose a new statistical model suitable for machine learning of systems with long-distance correlations, such as natural languages. The model is based on a directed acyclic graph decorated by …

References

Showing 1-10 of 61 references
Perfect Sampling with Unitary Tensor Networks
Tensor network states are powerful variational Ansätze for many-body ground states of quantum lattice models. The use of Monte Carlo sampling techniques in tensor network approaches significantly …
The Alternating Linear Scheme for Tensor Optimization in the Tensor Train Format
TLDR: This article shows how optimization tasks can be treated in the TT format by a generalization of the well-known alternating least squares (ALS) algorithm, and by a modified approach (MALS) that enables dynamical rank adaptation.
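As a concrete illustration of the scheme this TLDR summarizes, the sketch below runs ALS on a toy problem: with all but one TT core frozen, the model is linear in the free core, so each update is an ordinary least-squares solve. The 3-core problem, ranks, and pseudo-inverse solves are illustrative simplifications, not the article's formulation.

import numpy as np

rng = np.random.default_rng(0)
n1, n2, n3, r1, r2 = 6, 7, 8, 3, 3
A = rng.normal(size=(n1, n2, n3))

G1 = rng.normal(size=(n1, r1))          # first core
G2 = rng.normal(size=(r1, n2, r2))      # middle core
G3 = rng.normal(size=(r2, n3))          # last core

def tt_full(G1, G2, G3):
    return np.einsum('ia,ajb,bk->ijk', G1, G2, G3)

for sweep in range(10):
    # update G1: A(i; jk) ~ G1 @ W, with W(a; jk) held fixed
    W = np.einsum('ajb,bk->ajk', G2, G3).reshape(r1, -1)
    G1 = A.reshape(n1, -1) @ np.linalg.pinv(W)
    # update G2: for each j, A[:, j, :] ~ G1 @ G2[:, j, :] @ G3
    P1, P3 = np.linalg.pinv(G1), np.linalg.pinv(G3)
    G2 = np.einsum('ai,ijk,kb->ajb', P1, A, P3)
    # update G3: A(ij; k) ~ M @ G3, with M(ij; b) held fixed
    M = np.einsum('ia,ajb->ijb', G1, G2).reshape(-1, r2)
    G3 = np.linalg.pinv(M) @ A.reshape(-1, n3)
    err = np.linalg.norm(A - tt_full(G1, G2, G3)) / np.linalg.norm(A)
print(f"relative error after sweeps: {err:.3f}")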
Hand-waving and Interpretive Dance: An Introductory Course on Tensor Networks
The curse of dimensionality associated with the Hilbert space of spin systems provides a significant obstruction to the study of condensed matter systems. Tensor networks have proven an important …
Tensor Networks for Big Data Analytics and Large-Scale Optimization Problems
TLDR: The main objective of this paper is to show how tensor networks can be used to solve a wide class of big-data optimization problems by applying tensorization, performing all operations on relatively small matrices and tensors, and iteratively applying optimized, approximate tensor contractions.
An exact mapping between the Variational Renormalization Group and Deep Learning
TLDR: This work constructs an exact mapping between the variational renormalization group, first introduced by Kadanoff, and deep learning architectures based on Restricted Boltzmann Machines (RBMs), suggesting that deep learning algorithms may employ a generalized RG-like scheme to learn relevant features from data.
Tensor Network States and Geometry
Tensor network states are used to approximate ground states of local Hamiltonians on a lattice in D spatial dimensions. Different types of tensor network states can be seen to generate different …
Tensorizing Neural Networks
TLDR: This paper converts the dense weight matrices of fully-connected layers to the Tensor Train format, so that the number of parameters is reduced by a huge factor while the expressive power of the layer is preserved.
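The parameter saving is easy to see in a toy version: the sketch below factors a 256x256 weight matrix into two TT cores (mode sizes, rank, and random weights are illustrative choices, not the paper's layer shapes) and applies the layer via einsum.

import numpy as np

rng = np.random.default_rng(0)
i1, i2, o1, o2, r = 16, 16, 16, 16, 4

# cores: G1 maps input mode 1 -> output mode 1, G2 likewise for mode 2
G1 = rng.normal(size=(o1, i1, r))        # (out1, in1, rank)
G2 = rng.normal(size=(r, o2, i2))        # (rank, out2, in2)

def tt_layer(x):
    """y[o1, o2] = sum over i1, i2, r of G1[o1, i1, r] G2[r, o2, i2] x[i1, i2]"""
    X = x.reshape(i1, i2)
    return np.einsum('air,rbj,ij->ab', G1, G2, X).reshape(-1)

x = rng.normal(size=i1 * i2)
y = tt_layer(x)                          # length o1*o2 = 256

dense_params = (o1 * o2) * (i1 * i2)     # 65,536 for the dense layer
tt_params = G1.size + G2.size            # 2,048 for the two cores
print(y.shape, dense_params, tt_params)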
Exponential Machines
TLDR: This paper introduces Exponential Machines (ExM), a predictor that models all interactions of every order in a factorized format called Tensor Train (TT), and shows that the model achieves state-of-the-art performance on synthetic data with high-order interactions and performs on par on the MovieLens 100K recommender-system dataset.
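A sketch of the key trick, under illustrative shapes and random weights: a TT-format coefficient tensor lets the predictor score all 2^d interaction terms, f(x) = sum over subsets S of w_S times the product of x_k for k in S, with d small matrix products, without ever materializing the 2^d weights.

import numpy as np

rng = np.random.default_rng(0)
d, r = 8, 3   # 8 features -> 2^8 = 256 interaction weights, never materialized

# core k has shape (r_left, 2, r_right); index 0 = feature absent, 1 = present
cores = ([rng.normal(size=(1, 2, r))] +
         [rng.normal(size=(r, 2, r)) for _ in range(d - 2)] +
         [rng.normal(size=(r, 2, 1))])

def exm_predict(x):
    v = np.ones((1,))
    for k, G in enumerate(cores):
        v = v @ (G[:, 0, :] + x[k] * G[:, 1, :])   # absorb feature k
    return v[0]

x = rng.normal(size=d)
print(exm_predict(x))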
Tensor Network Renormalization.
We introduce a coarse-graining transformation for tensor networks that can be applied to study both the partition function of a classical statistical system and the Euclidean path integral of a …
Tensor-Train Decomposition
  • I. Oseledets
  • Mathematics, Computer Science
  • SIAM J. Sci. Comput.
  • 2011
TLDR: The new form gives a clear and convenient way to implement all basic operations efficiently, and the efficiency is demonstrated by the computation of the smallest eigenvalue of a 19-dimensional operator.
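The decomposition itself can be sketched compactly: sweep over the modes, reshape, truncate an SVD, and carry the remainder to the next mode. Hard truncation at a fixed maximal rank below is a simplification; the paper truncates to a prescribed accuracy instead.

import numpy as np

def tt_svd(A, max_rank):
    """Decompose an ndarray into TT cores of shape (r_left, n_k, r_right)."""
    dims = A.shape
    cores, r = [], 1
    M = A.reshape(r * dims[0], -1)
    for k, n in enumerate(dims[:-1]):
        U, S, Vt = np.linalg.svd(M, full_matrices=False)
        r_new = min(max_rank, len(S))
        cores.append(U[:, :r_new].reshape(r, n, r_new))
        # fold singular values into the remainder and move to the next mode
        M = (S[:r_new, None] * Vt[:r_new]).reshape(r_new * dims[k + 1], -1)
        r = r_new
    cores.append(M.reshape(r, dims[-1], 1))
    return cores

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 5, 6, 7))
cores = tt_svd(A, max_rank=8)
B = cores[0]
for G in cores[1:]:
    B = np.einsum('...a,anb->...nb', B, G)
# relative truncation error of the rank-capped TT approximation
print(np.linalg.norm(A - B.reshape(A.shape)) / np.linalg.norm(A))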