Machine-Learning Non-Conservative Dynamics for New-Physics Detection

@article{Liu2021MachineLearningND,
  title={Machine-Learning Non-Conservative Dynamics for New-Physics Detection},
  author={Ziming Liu and Bohan Wang and Qi Meng and Wei Chen and Max Tegmark and Tie-Yan Liu},
  journal={ArXiv},
  year={2021},
  volume={abs/2106.00026}
}
Energy conservation is a basic physics principle, the breakdown of which often implies new physics. This paper presents a method for data-driven “new physics” discovery. Specifically, given a trajectory governed by unknown forces, our Neural New-Physics Detector (NNPhD) aims to detect new physics by decomposing the force field into conservative and non-conservative components, which are represented by a Lagrangian Neural Network (LNN) and a universal approximator network (UAN), respectively…
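Below is a minimal sketch of the decomposition idea, not the authors' exact architecture or training objective: the conservative branch derives its force from a learned scalar potential via automatic differentiation (a simplification of the paper's Lagrangian Neural Network branch), the non-conservative branch is a free-form network, and an L1-style penalty pushes as much of the observed force as possible into the conservative branch. All class names, layer sizes, and the weight lam are illustrative.

import torch
import torch.nn as nn

class NNPhDSketch(nn.Module):
    """Decompose a force field into conservative + non-conservative parts.
    The conservative branch uses F_c = -dV/dq for a learned potential V(q),
    standing in for the Lagrangian Neural Network used in the paper."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.potential = nn.Sequential(          # scalar potential V(q)
            nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, 1))
        self.non_conservative = nn.Sequential(   # free-form force F_n(q, qdot)
            nn.Linear(2 * dim, hidden), nn.Tanh(), nn.Linear(hidden, dim))

    def forward(self, q, qdot):
        q = q.requires_grad_(True)
        V = self.potential(q).sum()
        F_cons = -torch.autograd.grad(V, q, create_graph=True)[0]
        F_non = self.non_conservative(torch.cat([q, qdot], dim=-1))
        return F_cons, F_non

def nnphd_loss(model, q, qdot, force, lam=0.1):
    """Fit F_cons + F_non to the observed force; the L1 penalty on F_non
    attributes as much of the force as possible to the conservative part."""
    F_cons, F_non = model(q, qdot)
    fit = ((F_cons + F_non - force) ** 2).mean()
    return fit + lam * F_non.abs().mean()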

Citations

Extracting Dynamical Models from Data
TLDR: The FJet approach is introduced for determining the underlying model of a dynamical system; by doing the modeling in the phase space of the system rather than over the time domain, it naturally overcomes the "extrapolation problem".
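As a toy illustration of modeling in phase space rather than over the time domain (this is not the FJet pipeline itself; the oscillator, step size, and linear model are illustrative): simulate a damped oscillator, then regress the one-step updates (dx, dv) on the current state (x, v).

import numpy as np

# Simulate a damped harmonic oscillator: x'' = -x - 0.1 x'
dt, steps = 0.01, 5000
x, v = 1.0, 0.0
states, updates = [], []
for _ in range(steps):
    a = -x - 0.1 * v
    x_new, v_new = x + dt * v, v + dt * a
    states.append([x, v])
    updates.append([x_new - x, v_new - v])
    x, v = x_new, v_new

X = np.array(states)           # phase-space points (x, v)
Y = np.array(updates)          # one-step updates (dx, dv)

# Linear least-squares model of the update map in phase space.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)
print("learned system matrix (approx. [[0, 1], [-1, -0.1]]):")
print(W.T / dt)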
Equivariant vector field network for many-body system modeling
  • Weitao Du, He Zhang, …, Tie-Yan Liu
  • Computer Science, Mathematics
  • ArXiv
  • 2021
TLDR: The Equivariant Vector Field Network (EVFN), built on a novel tuple of equivariant basis vectors and the associated scalarization and vectorization layers, is proposed; it achieves the best or competitive performance relative to baseline models on various types of datasets.

References

Showing 1-10 of 30 references
Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
Abstract: We introduce physics-informed neural networks – neural networks that are trained to solve supervised learning tasks while respecting any given laws of physics described by general nonlinear partial differential equations.
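A minimal sketch of the physics-informed training signal (heavily simplified relative to the paper; the PDE, network size, and collocation sampling are illustrative): a network u(x, t) is penalized wherever its derivatives, obtained by automatic differentiation, violate the 1-D heat equation u_t = u_xx. Initial, boundary, and data terms are omitted.

import torch
import torch.nn as nn

u_net = nn.Sequential(nn.Linear(2, 64), nn.Tanh(),
                      nn.Linear(64, 64), nn.Tanh(),
                      nn.Linear(64, 1))

def pde_residual(x, t):
    """Residual of the 1-D heat equation u_t - u_xx at collocation points."""
    x = x.requires_grad_(True)
    t = t.requires_grad_(True)
    u = u_net(torch.cat([x, t], dim=-1))
    u_t = torch.autograd.grad(u.sum(), t, create_graph=True)[0]
    u_x = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x.sum(), x, create_graph=True)[0]
    return u_t - u_xx

# Physics loss on random collocation points (data/boundary losses omitted).
x_col = torch.rand(256, 1)
t_col = torch.rand(256, 1)
physics_loss = pde_residual(x_col, t_col).pow(2).mean()
physics_loss.backward()   # gradients flow into u_net's parameters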
Toward an artificial intelligence physicist for unsupervised learning.
TLDR: This work proposes a paradigm centered on the learning and manipulation of theories, which parsimoniously predict both aspects of the future and the domain in which these predictions are accurate, and introduces a generalized-mean loss to encourage each theory to specialize in its comparatively advantageous domain.
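The generalized-mean idea can be sketched as follows (hedged: the exact formulation in the paper may differ in normalization and in where the mean is taken): per-example losses from several candidate theories are aggregated with a generalized mean using a negative exponent, which acts as a soft minimum, so each theory is mainly trained on the examples it already explains best.

import torch

def generalized_mean_loss(per_theory_losses, gamma=-1.0, eps=1e-9):
    """per_theory_losses: tensor of shape (n_theories, n_examples).
    For gamma < 0 the generalized mean over theories approaches the minimum,
    so the best-fitting theory on each example dominates the gradient."""
    l = per_theory_losses.clamp_min(eps)              # avoid 0 ** negative power
    agg = (l ** gamma).mean(dim=0) ** (1.0 / gamma)   # soft min over theories
    return agg.mean()                                 # average over examples

# Toy check: theory 0 fits the first two examples, theory 1 the last two.
losses = torch.tensor([[0.01, 0.02, 1.00, 1.20],
                       [0.90, 1.10, 0.03, 0.02]])
print(generalized_mean_loss(losses))   # far below the plain average over theories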
Galileo: Perceiving Physical Object Properties by Integrating a Physics Engine with Deep Learning
TLDR: This study points towards an account of human vision with generative physical knowledge at its core, and various recognition models as helpers leading to efficient inference.
The basic physics of the binary black hole merger GW150914
The first direct gravitational-wave detection was made by the Advanced Laser Interferometer Gravitational Wave Observatory on September 14, 2015. The GW150914 signal was strong enough to be apparent…
Forecasting Hamiltonian dynamics without canonical coordinates
TLDR: This work prepends a conventional neural network to a Hamiltonian neural network and shows that the combination accurately forecasts Hamiltonian dynamics from generalised noncanonical coordinates.
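A schematic of the combination described in the summary (the encoder and Hamiltonian network sizes are assumptions, not the paper's exact setup): a conventional network first maps observed, possibly noncanonical coordinates to a learned canonical pair (q, p); a Hamiltonian network then yields the symplectic vector field from a learned scalar H(q, p). How training targets are formed and how predictions are mapped back to observed coordinates are paper-specific and omitted here.

import torch
import torch.nn as nn

class CanonicalizerHNN(nn.Module):
    """Conventional encoder (observations -> learned canonical (q, p))
    prepended to a Hamiltonian neural network."""
    def __init__(self, obs_dim, dof=1, hidden=64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(obs_dim, hidden), nn.Tanh(),
                                     nn.Linear(hidden, 2 * dof))
        self.hamiltonian = nn.Sequential(nn.Linear(2 * dof, hidden), nn.Tanh(),
                                         nn.Linear(hidden, 1))
        self.dof = dof

    def forward(self, obs):
        z = self.encoder(obs)                  # learned canonical coordinates
        H = self.hamiltonian(z).sum()
        dH = torch.autograd.grad(H, z, create_graph=True)[0]
        dHdq, dHdp = dH[..., :self.dof], dH[..., self.dof:]
        # Hamilton's equations: dq/dt = dH/dp, dp/dt = -dH/dq
        return torch.cat([dHdp, -dHdq], dim=-1)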
Simplifying Hamiltonian and Lagrangian Neural Networks via Explicit Constraints
TLDR: This paper introduces a series of challenging chaotic and extended-body systems, including systems with N-pendulums, spring coupling, magnetic fields, rigid rotors, and gyroscopes, and shows that embedding the system into Cartesian coordinates and enforcing the constraints explicitly with Lagrange multipliers dramatically simplifies the learning problem.
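A small numerical illustration of the Cartesian-plus-explicit-constraints idea, independent of the paper's neural-network parameterization (the pendulum setup, mass, and gravity are illustrative): for a planar pendulum written in Cartesian coordinates with the holonomic constraint |x|^2 = L^2, the acceleration and the Lagrange multiplier follow from a single block linear solve.

import numpy as np

def pendulum_cartesian_accel(x, v, m=1.0, g=9.81, length=1.0):
    """Accelerations for a planar pendulum in Cartesian coordinates.
    Constraint phi(x) = 0.5 * (|x|^2 - length^2) = 0, enforced with a
    Lagrange multiplier via the block linear system
        [ M  J^T ] [ a      ]   [ f          ]
        [ J  0   ] [ lambda ] = [ -(dJ/dt) v ]
    where J = dphi/dx = x^T and (dJ/dt) v = |v|^2."""
    M = m * np.eye(2)
    f = np.array([0.0, -m * g])        # gravity
    J = x.reshape(1, 2)                # constraint Jacobian
    rhs_c = -np.array([v @ v])         # -(dJ/dt) v
    K = np.block([[M, J.T], [J, np.zeros((1, 1))]])
    sol = np.linalg.solve(K, np.concatenate([f, rhs_c]))
    return sol[:2], sol[2]             # acceleration, multiplier

x = np.array([np.sin(0.3), -np.cos(0.3)])   # on the constraint |x| = 1
v = np.array([0.0, 0.0])
a, lam = pendulum_cartesian_accel(x, v)
print(a, lam)   # tangential acceleration of magnitude g*sin(0.3)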
Extracting Interpretable Physical Parameters from Spatiotemporal Systems using Unsupervised Learning
TLDR: This work implements a physics-informed architecture based on variational autoencoders that is designed for analyzing systems governed by partial differential equations (PDEs) and extracts latent parameters that parameterize the dynamics of a learned predictive model for the system.
Why Does Deep and Cheap Learning Work So Well?
TLDR: It is argued that when the statistical process generating the data is of a certain hierarchical form prevalent in physics and machine learning, a deep neural network can be more efficient than a shallow one.
AI Poincaré: Machine Learning Conservation Laws from Trajectories
TLDR: AI Poincaré, a machine learning algorithm for autodiscovering conserved quantities using trajectory data from unknown dynamical systems, is presented; it is found to discover not only all exactly conserved quantities but also periodic orbits, phase transitions, and breakdown timescales for approximate conservation laws.
Hamiltonian Neural Networks
TLDR: Inspiration is drawn from Hamiltonian mechanics to train models that learn and respect exact conservation laws in an unsupervised manner; the resulting models train faster and generalize better than a regular neural network.
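A compact sketch of the idea (network size and toy data are illustrative): the network outputs a single scalar H(q, p), its symplectic gradient gives the predicted time derivatives, and the training loss matches those to observed derivatives, so the learned dynamics conserve the learned H by construction.

import torch
import torch.nn as nn

H_net = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 1))

def hnn_derivatives(q, p):
    """Predicted (dq/dt, dp/dt) from the symplectic gradient of learned H."""
    q = q.requires_grad_(True)
    p = p.requires_grad_(True)
    H = H_net(torch.cat([q, p], dim=-1)).sum()
    dHdq, dHdp = torch.autograd.grad(H, (q, p), create_graph=True)
    return dHdp, -dHdq          # Hamilton's equations

# Toy data from an ideal spring (H = 0.5 p^2 + 0.5 q^2): dq/dt = p, dp/dt = -q.
q = torch.randn(128, 1)
p = torch.randn(128, 1)
dq_true, dp_true = p.clone(), -q.clone()

dq_pred, dp_pred = hnn_derivatives(q, p)
loss = ((dq_pred - dq_true) ** 2 + (dp_pred - dp_true) ** 2).mean()
loss.backward()   # gradients flow into H_net's parameters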