Group-invariant tensor train networks for supervised learning

@article{Sprangers2022GroupinvariantTT,
  title={Group-invariant tensor train networks for supervised learning},
  author={Brent Sprangers and Nick Vannieuwenhoven},
  journal={ArXiv},
  year={2022},
  volume={abs/2206.15051}
}
Invariance has recently proven to be a powerful inductive bias in machine learning models. One such class of predictive or generative models is tensor networks. We introduce a new numerical algorithm to construct a basis of tensors that are invariant under the action of normal matrix representations of an arbitrary discrete group. This method can be up to several orders of magnitude faster than previous approaches. The group-invariant tensors are then combined into a group-invariant tensor…
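As a point of comparison for the abstract's claim, a baseline construction of such an invariant basis is the classical group-averaging (Reynolds) projector — not the paper's faster algorithm, and the toy group below is assumed purely for illustration:

```python
import numpy as np

def invariant_basis(group_mats, order, tol=1e-8):
    """Orthonormal basis of tensors fixed by g -> g (x) ... (x) g (`order`
    Kronecker factors), via the group-averaging (Reynolds) projector.
    For a unitary representation the projector is Hermitian, so its
    singular vectors with singular value 1 span the invariant subspace."""
    n = group_mats[0].shape[0]
    P = np.zeros((n ** order, n ** order))
    for g in group_mats:
        rep = np.array([[1.0]])
        for _ in range(order):
            rep = np.kron(rep, g)   # representation on the tensor space
        P += rep
    P /= len(group_mats)
    U, s, _ = np.linalg.svd(P)
    return U[:, s > 1.0 - tol]      # columns = invariant basis tensors

# Assumed toy example: the swap group {I, S} on R^2, acting on order-2
# tensors; the invariant matrices are the centrosymmetric ones
# (a = d, b = c), a 2-dimensional space.
S = np.array([[0.0, 1.0], [1.0, 0.0]])
B = invariant_basis([np.eye(2), S], order=2)
print(B.shape)  # (4, 2): four entries per tensor, two basis tensors
```

The averaging approach scales with the group order and the full tensor-space dimension, which is exactly the cost the paper's specialized algorithm is said to avoid.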


References

Showing 1–10 of 38 references

Supervised Learning with Quantum-Inspired Tensor Networks

It is demonstrated how algorithms for optimizing such networks can be adapted to supervised learning tasks by using matrix product states (tensor trains) to parameterize models for classifying images.
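The matrix-product-state classifier described here amounts to a left-to-right contraction of a tensor train with a feature-mapped input. The local feature map below follows Stoudenmire and Schwab; the core shapes, random initialization, and placement of the class label on the last core are assumptions made for this sketch:

```python
import numpy as np

def feature_map(x):
    # Local feature map for pixel values in [0, 1] (Stoudenmire & Schwab).
    return np.stack([np.cos(np.pi * x / 2), np.sin(np.pi * x / 2)], axis=-1)

def mps_scores(cores, x):
    """Contract an MPS (tensor train) with the feature-mapped input,
    sweeping left to right; the final core carries the class leg."""
    phi = feature_map(x)                              # (n_sites, 2)
    v = np.einsum('p,apb->b', phi[0], cores[0])       # left boundary bond = 1
    for i in range(1, len(cores) - 1):
        v = np.einsum('a,p,apb->b', v, phi[i], cores[i])
    return np.einsum('a,p,apc->c', v, phi[-1], cores[-1])  # (n_classes,)

rng = np.random.default_rng(0)
n_sites, bond, n_classes = 6, 4, 3
cores = ([rng.normal(size=(1, 2, bond))]
         + [rng.normal(size=(bond, 2, bond)) for _ in range(n_sites - 2)]
         + [rng.normal(size=(bond, 2, n_classes))])
scores = mps_scores(cores, rng.uniform(size=n_sites))
print(scores.shape)  # (3,)
```

Training then optimizes the cores against a classification loss; the sweep keeps the cost linear in the number of sites for fixed bond dimension.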

Segmenting two-dimensional structures with strided tensor networks

This work proposes a novel formulation of tensor networks for supervised image segmentation which allows them to operate on high resolution medical images and uses the matrix product state (MPS) tensor network on non-overlapping patches of a given input image to predict the segmentation mask.

Tree Tensor Networks for Generative Modeling

It is shown that the TTN is superior to MPSs for generative modeling in keeping the correlation of pixels in natural images, as well as giving better log-likelihood scores in standard data sets of handwritten digits.

Tensor-Train Decomposition

The new form gives a clear and convenient way to implement all basic operations efficiently, and the efficiency is demonstrated by the computation of the smallest eigenvalue of a 19-dimensional operator.
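The decomposition rests on repeated truncated SVDs of the tensor's unfoldings. A minimal sketch of that idea (a relative truncation tolerance is the only knob here; this is not Oseledets' full algorithm with its error guarantees) is:

```python
import numpy as np

def tt_svd(T, tol=1e-12):
    """Split T into tensor-train cores by sequential truncated SVDs of
    its unfoldings (a minimal sketch of the TT-SVD idea)."""
    shape, cores, r_prev = T.shape, [], 1
    A = T.reshape(shape[0], -1)
    for k in range(len(shape) - 1):
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        r = max(1, int(np.sum(s > tol * s[0])))        # relative cutoff
        cores.append(U[:, :r].reshape(r_prev, shape[k], r))
        A = (s[:r, None] * Vt[:r]).reshape(r * shape[k + 1], -1)
        r_prev = r
    cores.append(A.reshape(r_prev, shape[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract the cores back into a full tensor (for checking)."""
    out = cores[0]
    for C in cores[1:]:
        out = np.tensordot(out, C, axes=([-1], [0]))
    return out.reshape(out.shape[1:-1])

rng = np.random.default_rng(0)
T = rng.normal(size=(3, 4, 5))
cores = tt_svd(T)
print([c.shape for c in cores])
```

With a loose tolerance the same loop yields a low-rank approximation instead of an exact decomposition.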

Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges

A 'geometric unification' endeavour that provides a common mathematical framework to study the most successful neural network architectures, such as CNNs, RNNs, GNNs, and Transformers, gives a constructive procedure to incorporate prior physical knowledge into neural architectures, and provides a principled way to build future architectures yet to be invented.

A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups

This work provides a completely general algorithm for solving for the equivariant layers of matrix groups and constructs multilayer perceptrons equivariant to multiple groups that have never been tackled before, including the Rubik's cube group.
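For contrast with this general nullspace-based algorithm, the classical route to an equivariant layer for a finite group is "twirling": projecting an arbitrary weight matrix onto the equivariant subspace by group averaging. The cyclic-shift group below is an assumed example, chosen because its equivariant maps are the circulant matrices:

```python
import numpy as np

def twirl(W, reps_in, reps_out):
    """Project W onto the G-equivariant maps by group averaging:
    W -> mean_g rho_out(g)^{-1} @ W @ rho_in(g)."""
    acc = np.zeros_like(W, dtype=float)
    for gin, gout in zip(reps_in, reps_out):
        acc += np.linalg.inv(gout) @ W @ gin
    return acc / len(reps_in)

# Assumed example: cyclic shifts on R^4 as both input and output rep.
n = 4
shift = np.roll(np.eye(n), 1, axis=0)                  # one-step cyclic shift
reps = [np.linalg.matrix_power(shift, k) for k in range(n)]
rng = np.random.default_rng(1)
W = twirl(rng.normal(size=(n, n)), reps, reps)
# Equivariance check: W commutes with every group element.
assert all(np.allclose(W @ g, g @ W) for g in reps)
```

Twirling costs one pass over the whole group per layer, which is what makes specialized constructions attractive for large groups.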

Global symmetries in tensor network states: Symmetric tensors versus minimal bond dimension

It is argued that in some situations there are important conceptual reasons to prefer a tensor network representation with symmetric tensors (and possibly larger bond dimension) over one with minimal bond dimension.

Hand-waving and Interpretive Dance: An Introductory Course on Tensor Networks

The curse of dimensionality associated with the Hilbert space of spin systems provides a significant obstruction to the study of condensed matter systems. Tensor networks have proven an important…

Algorithm 862: MATLAB tensor classes for fast algorithm prototyping

Four MATLAB classes for tensor manipulations that can be used for fast algorithm prototyping are described and their use is demonstrated by showing how to implement several tensor algorithms that have appeared in the literature.

rTensor: An R Package for Multidimensional Array (Tensor) Unfolding, Multiplication, and Decomposition

An S4 class that wraps around the base 'array' class and overloads familiar operations to users of 'array', and additional functionality for tensor operations that are becoming more relevant in recent literature are provided.
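The mode-n unfolding at the heart of such packages is a one-liner in NumPy. Note that rTensor and Kolda–Bader fix a particular column ordering which this sketch does not reproduce exactly; the row index and the matrix shape agree, but the columns may come out permuted:

```python
import numpy as np

def unfold(T, mode):
    """Mode-`mode` unfolding: that mode's index becomes the rows and all
    remaining indices are flattened into the columns."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

T = np.arange(24).reshape(2, 3, 4)
print(unfold(T, 1).shape)  # (3, 8)
```

Matricizing this way reduces tensor multiplication and decomposition to ordinary matrix algebra, which is exactly what these packages overload for the user.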