# Deterministic Tensor Network Classifiers

@inproceedings{Wright2022DeterministicTN, title={Deterministic Tensor Network Classifiers}, author={Lewis Wright and Fergus Barratt and James Dborin and V. Wimalaweera and B. Coyle and Andrew G. Green}, year={2022} }

L. Wright,¹ F. Barratt,² J. Dborin,³ V. Wimalaweera,³ B. Coyle,³ and A. G. Green³,⁴

¹ Department of Mathematics, King's College London, Strand, London WC2R 2LS, United Kingdom
² Department of Physics, Amherst, Massachusetts, United States of America
³ London Centre for Nanotechnology, University College London, Gordon St., London, WC1H 0AH, United Kingdom

Email: andrew.green@ucl.ac.uk (Dated: May 23, 2022)


## References

Showing 1–10 of 61 references.

Revisiting dequantization and quantum advantage in learning tasks

- Computer Science, ArXiv
- 2021


Dissipative failure of adiabatic quantum transport as a dynamical phase transition

- Physics, Physical Review A
- 2021

F. Barratt, Aleix Bou Comas, P. Crowley, V. Oganesyan, P. Sollich, and A. G. Green (Department of Mathematics, King's College London, Strand, London WC2R 2LS, United Kingdom; Graduate Program in…)

Entanglement and Tensor Networks for Supervised Image Classification

- Computer Science, ArXiv
- 2020

The use of tensor networks for supervised image classification on the MNIST data set of handwritten digits, as pioneered by Stoudenmire and Schwab, is revisited, and the entanglement properties of the trained models are investigated.

Learning Relevant Features of Data with Multi-scale Tensor Networks

- Computer Science, ArXiv
- 2018

Inspired by coarse-graining approaches used in physics, it is shown how similar algorithms based on layered tree tensor networks can be adapted to data, scaling linearly with both the dimension of the input and the training set size.
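The coarse-graining idea summarized above can be sketched with a single tree-tensor-network layer: neighbouring feature vectors are merged pairwise through a three-index tensor, halving the number of sites at each step. This is an illustrative sketch only — the function names and shapes are assumptions, and untrained random tensors stand in for the learned isometries.

```python
import numpy as np

def coarse_grain(features, isometries):
    # One layer of a tree tensor network: merge neighbouring feature
    # vectors pairwise, halving the number of sites. Each merging
    # tensor has shape (d, d, d_out), mapping two d-dimensional
    # vectors to one d_out-dimensional vector.
    out = []
    for i in range(0, len(features), 2):
        w = isometries[i // 2]
        out.append(np.einsum('a,b,abc->c', features[i], features[i + 1], w))
    return out

rng = np.random.default_rng(0)
d = 2
feats = [rng.normal(size=d) for _ in range(8)]          # 8 input sites
layer = [rng.normal(size=(d, d, d)) for _ in range(4)]  # 4 merging tensors
reduced = coarse_grain(feats, layer)
print(len(reduced))  # 4 sites remain after one layer
```

Stacking log₂(N) such layers reduces N input features to a single vector, which is why the cost scales linearly in the input dimension.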

Supervised Learning with Tensor Networks

- Computer Science, NIPS
- 2016

It is demonstrated how algorithms for optimizing tensor networks can be adapted to supervised learning tasks by using matrix product states (tensor trains) to parameterize non-linear kernel learning models.
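The matrix-product-state (tensor-train) model described above can be sketched in a few lines: each pixel is lifted by a local feature map and the resulting vectors are contracted through the MPS, with one index reserved for the class label. The feature map follows Stoudenmire and Schwab's construction; everything else (function names, placing the label on the final bond, random untrained tensors) is an assumption for illustration.

```python
import numpy as np

def feature_map(x):
    # Local feature map from Stoudenmire & Schwab: a pixel value
    # x in [0, 1] becomes a 2-component vector.
    return np.array([np.cos(np.pi * x / 2.0), np.sin(np.pi * x / 2.0)])

def mps_classify(pixels, tensors):
    # Contract pixel feature vectors into the matrix product state
    # (tensor train) from left to right. tensors[i] has shape
    # (bond_left, 2, bond_right); the final bond index plays the role
    # of the class label, so the result is a vector of class scores.
    v = np.ones(1)
    for x, A in zip(pixels, tensors):
        v = np.einsum('l,ldr,d->r', v, A, feature_map(x))
    return v

# Untrained toy model: 8 "pixels", bond dimension 4, 10 classes.
rng = np.random.default_rng(42)
chi, n_pixels, n_classes = 4, 8, 10
dims = [1] + [chi] * (n_pixels - 1) + [n_classes]
tensors = [rng.normal(size=(dims[i], 2, dims[i + 1])) for i in range(n_pixels)]

scores = mps_classify(rng.uniform(size=n_pixels), tensors)
print(scores.shape)  # (10,)
```

Training would then optimize the MPS tensors (e.g. by gradient descent or DMRG-style sweeps) against a classification loss on these scores.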

Improvements to Gradient Descent Methods for Quantum Tensor Network Machine Learning

- Computer Science, ArXiv
- 2022

A ‘copy node’ method is introduced that successfully initializes arbitrary tensor networks, together with a gradient-based regularization technique for bond dimensions; these produce quantum-inspired tensor network models with far fewer parameters while improving generalization performance.

Tree Tensor Networks for Generative Modeling

- Computer Science, Physical Review B
- 2019

It is shown that the TTN is superior to the MPS for generative modeling, both in capturing pixel correlations in natural images and in achieving better log-likelihood scores on standard data sets of handwritten digits.

Isometric Tensor Network States in Two Dimensions.

- Computer Science, Physics, Physical Review Letters
- 2020

An isometric restriction of the TNS ansatz that allows for highly efficient contraction of the network is introduced, and it is shown that a matrix-product-state representation of a 2D quantum state can be iteratively transformed into an isometric 2D TNS.
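The isometric restriction amounts to requiring that each tensor, when reshaped into a matrix from its "incoming" legs to its "outgoing" leg, satisfies W†W = 1 — which is what makes contractions collapse efficiently. A minimal numpy check of this condition (the shapes and construction here are illustrative assumptions, not the paper's algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build an isometry via QR: an (8 x 3) matrix whose columns are orthonormal.
M = rng.normal(size=(8, 3))
W, _ = np.linalg.qr(M)                     # reduced QR: W has shape (8, 3)
assert np.allclose(W.T.conj() @ W, np.eye(3))

# Reshape into a rank-3 tensor: two incoming legs (dims 2 and 4), one
# outgoing bond leg (dim 3). The isometry condition, written as a tensor
# contraction over the incoming legs, gives the identity on the bond leg.
T = W.reshape(2, 4, 3)
ident = np.einsum('abi,abj->ij', T.conj(), T)
assert np.allclose(ident, np.eye(3))
```

Because contracting such a tensor with its conjugate yields the identity, whole regions of the network cancel out during contraction, which is the source of the efficiency the summary refers to.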

Properties of the After Kernel

- Computer Science, ArXiv
- 2021

The “after kernel” is studied, which is defined using the same embedding, except after training, for neural networks with standard architectures, on binary classification problems extracted from MNIST and CIFAR-10, trained using SGD in a standard way.