Disentangling a Deep Learned Volume Formula

@article{Craven2020DisentanglingAD,
  title={Disentangling a Deep Learned Volume Formula},
  author={Jessica Craven and Vishnu Jejjala and Arjun Kar},
  journal={ArXiv},
  year={2020},
  volume={abs/2012.03955}
}
We present a simple phenomenological formula which approximates the hyperbolic volume of a knot using only a single evaluation of its Jones polynomial at a root of unity. The average error is just 2.86% on the first 1.7 million knots, which represents a large improvement over previous formulas of this kind. To find the approximation formula, we use layer-wise relevance propagation to reverse engineer a black box neural network which achieves a similar average error for the same approximation… 
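
For concreteness, here is a minimal sketch of the kind of approximation the abstract describes: evaluate the Jones polynomial once at a root of unity and map the magnitude through a linear function of its logarithm. The root exp(3πi/4) and the constants a, b below are placeholders for illustration, not the fitted values from the paper.

```python
import numpy as np

def evaluate_jones(coeffs, min_degree, q):
    """Evaluate a Jones polynomial given as a coefficient list.

    coeffs[k] is the coefficient of q**(min_degree + k).
    """
    return sum(c * q ** (min_degree + k) for k, c in enumerate(coeffs))

def approx_volume(coeffs, min_degree, a=1.0, b=0.0):
    """Linear-in-log approximation a * log|J(q0)| + b.

    q0 and the constants a, b are placeholders, not the paper's
    fitted values.
    """
    q0 = np.exp(3j * np.pi / 4)  # assumed root of unity for illustration
    j = evaluate_jones(coeffs, min_degree, q0)
    return a * np.log(abs(j)) + b

# Example: figure-eight knot, J(q) = q**-2 - q**-1 + 1 - q + q**2
print(approx_volume([1, -1, 1, -1, 1], min_degree=-2))
```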

Machine-Learning Mathematical Structures

  • Yang-Hui He
  • Computer Science
    International Journal of Data Science in the Mathematical Sciences
  • 2022
TLDR
Focusing on supervised machine learning of labeled data from fields ranging from geometry to representation theory and from combinatorics to number theory, this work presents a comparative study of the accuracies achieved on different problems.

Neural Network Approximations for Calabi-Yau Metrics

TLDR
This work employs techniques from machine learning to deduce numerical flat metrics for K3, the Fermat quintic, and the Dwork quintic, using a simple, modular neural network architecture capable of approximating Ricci-flat Kähler metrics for Calabi-Yau manifolds of dimensions two and three.
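
As a generic illustration of this kind of setup (not the paper's architecture or loss), the sketch below regresses a single metric component at sampled coordinate points with a small feed-forward network; the target values are synthetic stand-ins for outputs of an actual Ricci-flatness objective.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical setup: regress one metric component g_11(x) from local
# coordinates x at sampled patch points; the target here is synthetic.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(2000, 4))        # local real coordinates
y = 1.0 / (1.0 + (X ** 2).sum(axis=1))        # toy stand-in for g_11(x)

net = MLPRegressor(hidden_layer_sizes=(64, 64), activation="tanh",
                   max_iter=2000, random_state=0)
net.fit(X[:1500], y[:1500])
print("held-out R^2:", net.score(X[1500:], y[1500:]))
```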

Narrowing the Gap between Combinatorial and Hyperbolic Knot Invariants via Deep Learning

TLDR
A statistical approach to discovering relationships between mathematical entities, based on linear regression and on deep learning with fully connected artificial neural networks, is presented, and empirical connections between combinatorial and hyperbolic knot invariants are revealed.
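
A minimal sketch of the linear-regression half of such a study, with synthetic stand-ins for both the combinatorial features and the hyperbolic target:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic stand-ins: rows are knots, columns are combinatorial
# invariants (e.g. crossing number, determinant); the target plays
# the role of a hyperbolic invariant such as the volume.
rng = np.random.default_rng(1)
X = rng.integers(3, 20, size=(500, 2)).astype(float)
vol = 2.0 * X[:, 0] + 0.3 * np.log(X[:, 1]) + rng.normal(0, 0.1, 500)

fit = LinearRegression().fit(X, vol)
print(fit.coef_, fit.intercept_, fit.score(X, vol))
```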

Learning knot invariants across dimensions

TLDR
It is found that a two-layer feed-forward neural network can predict the Rasmussen invariant s from Kh(q, −q⁻⁴) with greater than 99% accuracy, which suggests a novel relationship between the Khovanov and Lee homology theories of a knot.
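
A hedged sketch of such a pipeline, with random vectors standing in for the evaluated Khovanov polynomial data and a two-layer network as described; treating the prediction of s as classification is an assumption made here for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Synthetic stand-ins: features play the role of Kh(q, -q^(-4)) data,
# labels play the role of s-invariant values.
rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 16))          # stand-in polynomial data
s = (X[:, 0] > 0).astype(int) * 2 - 1    # toy stand-in for s in {-1, +1}

clf = MLPClassifier(hidden_layer_sizes=(100, 100), max_iter=500,
                    random_state=0).fit(X[:800], s[:800])
print("held-out accuracy:", clf.score(X[800:], s[800:]))
```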

Creating simple, interpretable anomaly detectors for new physics in jet substructure

TLDR
This work proposes two strategies that use a small number of high-level observables to mimic the decisions made by an autoencoder on background events: one designed to directly learn the output of the autoencoder, and the other designed to learn the difference between the autoencoder's outputs on a pair of events.
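
A minimal sketch of the first strategy, with a synthetic reconstruction error standing in for the autoencoder and a shallow decision tree as the interpretable mimic:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Placeholder setup: a few high-level observables per jet, and an
# "autoencoder score" that is synthetic here, standing in for a real
# reconstruction error. The tree learns to reproduce the anomaly tag.
rng = np.random.default_rng(3)
obs = rng.normal(size=(5000, 3))                  # e.g. mass, n-subjettiness ratios
ae_score = (obs ** 2).sum(axis=1)                 # placeholder reconstruction error
labels = ae_score > np.quantile(ae_score, 0.9)    # autoencoder's anomaly decision

surrogate = DecisionTreeClassifier(max_depth=3).fit(obs, labels)
print("agreement with autoencoder:", surrogate.score(obs, labels))
```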

(K)not machine learning

TLDR
The goal of this work is to translate numerical experiments with big data into new analytic results in aspects of Chern–Simons theory and higher-dimensional gauge theories.

Machine Learning Kreuzer-Skarke Calabi-Yau Threefolds

TLDR
Using a fully connected feedforward neural network, a simple expression for the Euler number is found that can be learned from limited data extracted from the polytope and its dual.
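
As an illustration of recovering a simple exact expression from limited data (all values synthetic, not Kreuzer-Skarke data), a plain linear fit suffices whenever the underlying relation is linear in the extracted counts:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical features: two counts extracted from a polytope and its
# dual; the target obeys an exact linear relation, which the fit
# recovers with R^2 = 1.
rng = np.random.default_rng(4)
pts = rng.integers(5, 40, size=(1000, 2)).astype(float)
chi = 2 * (pts[:, 0] - pts[:, 1])     # synthetic "Euler number"

model = LinearRegression().fit(pts, chi)
print(model.coef_, model.intercept_, model.score(pts, chi))
```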

References

Showing 1-10 of 76 references

Statistical Predictions in String Theory and Deep Generative Models

TLDR
It is demonstrated in a large ensemble of Calabi-Yau manifolds that Kähler metrics evaluated at points in Kähler moduli space are well approximated by ensembles of matrices produced by a deep convolutional Wasserstein GAN.

Towards Novel Insights in Lattice Field Theory with Explainable Machine Learning

TLDR
This work investigates action parameter regression as a pretext task while using layer-wise relevance propagation (LRP) to identify the most important observables depending on the location in the phase diagram, and argues that, owing to their broad applicability, attribution methods such as LRP could prove a useful and versatile tool in the search for new physical insights.
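
Since LRP is also the attribution method used in the headline paper, a minimal numpy implementation of the epsilon rule on a tiny ReLU network may be useful; weights and input are random placeholders, and the point is how output relevance flows back onto input features.

```python
import numpy as np

# Layer-wise relevance propagation (epsilon rule) through a tiny
# two-layer ReLU network with no biases.
rng = np.random.default_rng(5)
W1, W2 = rng.normal(size=(8, 4)), rng.normal(size=(1, 8))

x = rng.normal(size=4)
a1 = np.maximum(W1 @ x, 0.0)          # hidden ReLU activations
out = W2 @ a1                         # scalar network output

def lrp_eps(a_prev, W, relevance, eps=1e-9):
    """Redistribute relevance onto the previous layer's activations."""
    z = W @ a_prev                    # pre-activations of this layer
    s = relevance / (z + eps * np.where(z >= 0, 1.0, -1.0))
    return a_prev * (W.T @ s)

r_hidden = lrp_eps(a1, W2, out)       # relevance of hidden units
r_input = lrp_eps(x, W1, r_hidden)    # relevance of each input feature
print("input relevances:", r_input)
print("conservation check:", r_input.sum(), "vs output", out[0])
```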

Deep-Learning the Landscape

We propose a paradigm to deep-learn the ever-expanding databases which have emerged in mathematical physics and particle phenomenology, as diverse as the statistics of string vacua or combinatorial…

Discovering Symbolic Models from Deep Learning with Inductive Biases

TLDR
The correct known equations, including force laws and Hamiltonians, can be extracted from the neural network, and a new analytic formula is discovered that predicts the concentration of dark matter from the mass distribution of nearby cosmic structures.
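
The lead author's PySR library distills this kind of symbolic-regression workflow; whether it matches the paper's exact tooling is not claimed here, and the keyword arguments below reflect PySR's interface as commonly documented, so treat them as assumptions for other versions.

```python
import numpy as np
from pysr import PySRRegressor   # symbolic regression over synthetic data

# Recover an inverse-square law hidden in the data.
rng = np.random.default_rng(6)
X = rng.uniform(0.5, 2.0, size=(200, 2))
y = X[:, 0] / X[:, 1] ** 2       # hidden law: x0 / x1^2

model = PySRRegressor(niterations=40,
                      binary_operators=["+", "-", "*", "/"],
                      unary_operators=["square"])
model.fit(X, y)
print(model)                     # discovered equations ranked by complexity
```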

Branes with brains: exploring string vacua with deep reinforcement learning

TLDR
An artificial intelligence agent known as an asynchronous advantage actor-critic is used to explore type IIA compactifications with intersecting D6-branes, simultaneously solving various string theory consistency conditions phrased in terms of non-linear, coupled Diophantine equations.

Learning to unknot

TLDR
The UNKNOT problem of determining whether or not a given knot is the unknot is studied, and it is found that accuracy increases with the length of the braid word, and that the networks learn a direct correlation between the confidence of their predictions and the degree of the Jones polynomial.
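
A sketch of the standard input encoding for such experiments: braid words become fixed-length vectors of signed generator indices. The words and labels below are random placeholders rather than actual unknot data, so the pipeline, not the accuracy, is the point.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# A braid word is a sequence of signed generator indices
# (sigma_i -> +i, its inverse -> -i), zero-padded to fixed length.
rng = np.random.default_rng(7)
X = rng.integers(-2, 3, size=(300, 12)).astype(float)  # words in B_3, 0 = padding
y = rng.integers(0, 2, size=300)                       # 1 = unknot (synthetic labels)

clf = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=300, random_state=0)
clf.fit(X[:240], y[:240])
print("held-out accuracy:", clf.score(X[240:], y[240:]))
```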

Machine learning and algebraic approaches towards complete matter spectra in 4d F-theory

TLDR
A diagrammatic way to express cohomology jumps across the parameter space of each family of matter curves is presented, reflecting a stratification of the F-theory complex structure moduli space in terms of the vector-like spectrum.

A neural network approach to predicting and computing knot invariants

  • M. Hughes
  • Computer Science, Mathematics
    Journal of Knot Theory and Its Ramifications
  • 2020
TLDR
Artificial neural networks are shown to be able to predict with a high degree of accuracy when a knot is quasipositive, and to predict the slice genus and Ozsváth-Szabó τ-invariant of knots.

The colored Jones polynomial and the AJ conjecture

...