Detecting symmetries with neural networks

@article{Krippendorf2021DetectingSW,
  title={Detecting symmetries with neural networks},
  author={Sven Krippendorf and Marc Syvaeri},
  journal={Machine Learning: Science and Technology},
  year={2021},
  volume={2}
}
Identifying symmetries in data sets is generally difficult, but knowledge about them is crucial for efficient data handling. Here we present a method by which neural networks can be used to identify symmetries. We make extensive use of the structure in the embedding layer of the neural network, which allows us to determine whether a symmetry is present and to identify its orbits in the input. To determine which continuous or discrete symmetry group is present we analyse the invariant… 
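
As a rough illustration of this embedding-layer idea (a minimal sketch, not the authors' code: the toy task, radius-bin labels, network size and training schedule below are all illustrative assumptions), one can train a small classifier on data whose symmetry orbits are known and then inspect the learned embedding for orbit structure:

```python
# Toy sketch: 2D points labelled by a rotation-invariant quantity (their radius),
# so the symmetry orbits are circles. After training, points on the same orbit
# should land close together in the embedding layer.
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=(4096, 2)).astype(np.float32)
radius = np.linalg.norm(x, axis=1)
labels = np.digitize(radius, bins=np.linspace(0.0, np.sqrt(2.0), 6)) - 1  # 5 radius bins

class Net(nn.Module):
    def __init__(self, emb_dim=2, n_classes=5):
        super().__init__()
        self.features = nn.Sequential(nn.Linear(2, 64), nn.ReLU(),
                                      nn.Linear(64, emb_dim))  # "embedding" layer
        self.head = nn.Linear(emb_dim, n_classes)
    def forward(self, inp):
        emb = self.features(inp)
        return self.head(emb), emb

model = Net()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
X, y = torch.from_numpy(x), torch.from_numpy(labels)
for _ in range(200):  # short full-batch training loop, for illustration only
    opt.zero_grad()
    logits, _ = model(X)
    nn.functional.cross_entropy(logits, y).backward()
    opt.step()

with torch.no_grad():
    _, emb = model(X)
# Plotting `emb` coloured by `radius`, or clustering it, is one way to read off
# candidate orbits: points related by the rotational symmetry share a radius and
# should be mapped to nearby embedding vectors.
```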

Symmetry discovery with deep learning

What are the symmetries of a dataset? Whereas the symmetries of an individual data element can be characterized by its invariance under various transformations, the symmetries of an ensemble of data…

Symmetry-via-Duality: Invariant Neural Network Densities from Parameter-Space Correlators

TLDR
It is demonstrated that the amount of symmetry in the initialization density affects the accuracy of networks trained on Fashion-MNIST, and that symmetry breaking helps only when it is in the direction of ground truth.

Symmetry meets AI

TLDR
An interdisciplinary application of this procedure identifies the presence and level of symmetry in artistic paintings from different styles such as those of Picasso, Pollock and Van Gogh.

Human Symmetry Uncertainty Detected by a Self-Organizing Neural Network Map

TLDR
An artificial neural network is presented that detects symmetry uncertainty states in human observers and is tightly linked to the metric’s proven selectivity to local contrast and color variations in large and highly complex image data.

Symmetries, safety, and self-supervision

Collider searches face the challenge of defining a representation of high-dimensional data such that physical symmetries are manifest, the discriminating features are retained, and the choice of…

Machine learning Calabi-Yau four-folds

Learning Equivariant Representations

TLDR
This thesis proposes equivariant models for different transformations defined by groups of symmetries, and extends equivariance to other kinds of transformations, such as rotation and scaling.
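
One simple, concrete instance of equivariance (permutation equivariance in the DeepSets style, shown here only as an illustration and not one of the specific constructions from the thesis) can be written and checked in a few lines:

```python
# A permutation-equivariant layer: permuting the input set permutes the outputs.
import torch
import torch.nn as nn

class PermEquivariantLayer(nn.Module):
    """y_i = A x_i + B * mean_j(x_j), applied to a set of vectors."""
    def __init__(self, d_in, d_out):
        super().__init__()
        self.A = nn.Linear(d_in, d_out)
        self.B = nn.Linear(d_in, d_out, bias=False)

    def forward(self, x):                     # x: (batch, set_size, d_in)
        return self.A(x) + self.B(x.mean(dim=1, keepdim=True))

layer = PermEquivariantLayer(3, 5)
x = torch.randn(2, 10, 3)
perm = torch.randperm(10)
# Equivariance check: permuting inputs gives the same result as permuting outputs.
print(torch.allclose(layer(x[:, perm]), layer(x)[:, perm], atol=1e-6))
```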

Cluster Algebras: Network Science and Machine Learning

TLDR
Network analysis methods are applied to the exchange graphs of cluster algebras of varying mutation types; the results indicate that when the graphs are represented without identifying clusters related by permutation equivalence, an elegant symmetry emerges in the quiver exchange graph embedding.

Inverse Problems, Deep Learning, and Symmetry Breaking

TLDR
This work shows that careful symmetry breaking on the training data can remove these difficulties and significantly improve learning performance on the generalized phase retrieval problem.

Machine-Learning Mathematical Structures

  • Yang-Hui He, International Journal of Data Science in the Mathematical Sciences, 2022
TLDR
Focusing on supervised machine learning of labeled data from fields ranging from geometry and representation theory to combinatorics and number theory, this work presents a comparative study of the accuracies achieved on different problems.

References

Showing 1–10 of 27 references

Discovering Symmetry Invariants and Conserved Quantities by Interpreting Siamese Neural Networks

In this paper, we introduce interpretable Siamese neural networks (SNNs) for similarity detection to the field of theoretical physics. More precisely, we apply SNNs to events in special relativity…
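
The general Siamese setup is straightforward to sketch (illustrative only, not the paper's implementation; the 4-vector inputs, network width and similarity score below are assumptions): a shared encoder is applied to both members of a pair, the network is trained to decide whether the pair is similar, and the trained encoder can then be inspected for the invariant it has learned.

```python
# Minimal Siamese-network sketch: a shared encoder maps each event to a scalar,
# and the similarity score compares the two encodings.
import torch
import torch.nn as nn

class SiameseNet(nn.Module):
    def __init__(self, in_dim=4, hidden=64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                                     nn.Linear(hidden, 1))  # learned "invariant"

    def forward(self, a, b):
        # Pairs whose encodings agree get a score near 1, dissimilar pairs near 0.
        return torch.sigmoid(-(self.encoder(a) - self.encoder(b)).abs())

net = SiameseNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

# Dummy batch of pairs of 4-vectors with binary labels (1 = same orbit / similar).
a, b = torch.randn(32, 4), torch.randn(32, 4)
target = torch.randint(0, 2, (32, 1)).float()

opt.zero_grad()
loss_fn(net(a, b), target).backward()
opt.step()
# After training on real pairs, `net.encoder` is the object to interpret, e.g.
# by fitting a symbolic expression to its output.
```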

Visualizing Data using t-SNE

TLDR
A new technique called t-SNE visualizes high-dimensional data by giving each datapoint a location in a two- or three-dimensional map; it is a variation of Stochastic Neighbor Embedding that is much easier to optimize and produces significantly better visualizations by reducing the tendency to crowd points together in the center of the map.
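
For reference, the technique is available through scikit-learn's standard `TSNE` estimator; a minimal usage sketch (with random placeholder data in place of real embeddings) looks like this:

```python
# Reduce 50-dimensional points to 2D coordinates suitable for plotting.
import numpy as np
from sklearn.manifold import TSNE

X = np.random.rand(500, 50)                  # 500 points in 50 dimensions
X_2d = TSNE(n_components=2, perplexity=30.0, init="pca",
            random_state=0).fit_transform(X)
print(X_2d.shape)                            # (500, 2)
```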

Machine learning CICY threefolds

Machine Learning Line Bundle Cohomology

We investigate different approaches to machine learning of line bundle cohomology on complex surfaces as well as on Calabi-Yau three-folds. Standard function learning based on simple fully connected…

Getting CICY high

Evolving neural networks with genetic algorithms to study the string landscape

TLDR
Three areas in which neural networks can be applied are studied: classifying models according to a fixed set of (physically) appealing features, finding a concrete realization of a computation whose precise algorithm is known in principle but very tedious to implement, and predicting or approximating the outcome of an involved mathematical computation that is too inefficient to apply directly.

Learning atoms for materials discovery

TLDR
The unsupervised machine (Atom2Vec) learns the basic properties of atoms by itself from an extensive database of known compounds and materials; atoms are represented as high-dimensional vectors, and clustering them in vector space classifies them into meaningful groups consistent with human knowledge.
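
The Atom2Vec construction itself (building atom vectors from an atom-environment matrix) is not reproduced here; the sketch below only illustrates the downstream clustering step, with random placeholder vectors standing in for learned atom vectors:

```python
# Cluster high-dimensional "atom" vectors into groups.
import numpy as np
from sklearn.cluster import KMeans

atom_vectors = np.random.rand(80, 20)        # placeholder: 80 atoms, 20-dim vectors
groups = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(atom_vectors)
print(groups[:10])                           # cluster label assigned to each atom
```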

Heterotic line bundle standard models

In a previous publication, arXiv:1106.4804, we found 200 models from heterotic Calabi-Yau compactifications with line bundles, which lead to standard models after taking appropriate…