# Neural Symplectic Integrator with Hamiltonian Inductive Bias for the Gravitational $N$-body Problem

```bibtex
@article{Cai2021NeuralSI,
  title   = {Neural Symplectic Integrator with Hamiltonian Inductive Bias for the Gravitational $N$-body Problem},
  author  = {Maxwell Xu Cai and Simon Portegies Zwart and Damian Podareanu},
  journal = {ArXiv},
  year    = {2021},
  volume  = {abs/2111.15631}
}
```

The gravitational N-body problem, which is fundamentally important in astrophysics for predicting the motion of N celestial bodies under their mutual gravity, is usually solved numerically because there is no known general analytical solution for N > 2. Can an N-body problem be solved accurately by a neural network (NN)? Can an NN observe long-term conservation of energy and orbital angular momentum? Inspired by Wisdom & Holman's symplectic map, we present a neural N-body integrator…
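
For orientation, here is a minimal NumPy sketch of the kind of symplectic kick-drift-kick (leapfrog) step that such integrators build on; the function names are illustrative, and the paper's method replaces part of the update with a trained network, which this sketch does not attempt:

```python
import numpy as np

def accelerations(pos, masses, eps=1e-12):
    """Pairwise Newtonian accelerations (G = 1); pos has shape (N, 3)."""
    diff = pos[None, :, :] - pos[:, None, :]            # r_j - r_i, shape (N, N, 3)
    dist3 = (np.sum(diff**2, axis=-1) + eps) ** 1.5     # softened |r_ij|^3
    np.fill_diagonal(dist3, np.inf)                     # no self-force
    return np.sum(masses[None, :, None] * diff / dist3[:, :, None], axis=1)

def leapfrog_step(pos, vel, masses, dt):
    """One symplectic kick-drift-kick step."""
    vel = vel + 0.5 * dt * accelerations(pos, masses)   # half kick
    pos = pos + dt * vel                                # drift
    vel = vel + 0.5 * dt * accelerations(pos, masses)   # half kick
    return pos, vel
```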

## One Citation

### Learning Neural Hamiltonian Dynamics: A Methodological Overview

- Computer Science, ArXiv
- 2022

This paper systematically surveys recently proposed Hamiltonian neural network models, with a special emphasis on methodologies, and discusses the major contributions of these models in four overlapping directions.

## References

Showing 1-10 of 23 references.

### Symplectic maps for the N-body problem.

- Physics, Geology
- 1991

The present study generalizes the mapping method of Wisdom (1982) to encompass all gravitational n-body problems with a dominant central mass. The rationale for the generalized mapping method is…
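
Schematically, the method splits the Hamiltonian into a dominant Keplerian part, which can be advanced analytically, and a small interaction part applied as a periodic kick. The form below is the standard textbook splitting (the actual map works in Jacobi coordinates), not notation taken from the paper:

```latex
H = H_{\mathrm{Kepler}} + H_{\mathrm{interaction}}, \qquad
H_{\mathrm{Kepler}} = \sum_{i=1}^{N}\left(\frac{|\mathbf{p}_i|^2}{2m_i} - \frac{G m_i M_\star}{|\mathbf{r}_i|}\right), \qquad
H_{\mathrm{interaction}} = -\sum_{i<j}\frac{G m_i m_j}{|\mathbf{r}_i - \mathbf{r}_j|}
```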

### Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations

- Computer Science, J. Comput. Phys.
- 2019

### Symplectic ODE-Net: Learning Hamiltonian Dynamics with Control

- Computer Science, ICLR
- 2020

This paper introduces Symplectic ODE-Net, a deep learning framework that infers the dynamics of a physical system, given by an ordinary differential equation (ODE), from observed state trajectories. It proposes a parametrization that enforces the Hamiltonian formalism even when the generalized-coordinate data are embedded in a high-dimensional space, or when only velocity data are available instead of generalized momenta.
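
A hedged sketch of the core idea follows; the class name, layer sizes, and the elementwise control coupling are illustrative simplifications, not the authors' exact architecture. A learned scalar Hamiltonian generates the dynamics through its symplectic gradient, plus a learned input term:

```python
import torch
import torch.nn as nn

class ControlledHamiltonianDynamics(nn.Module):
    """Sketch: dq/dt = dH/dp, dp/dt = -dH/dq + g(q) * u, with H and g learned."""
    def __init__(self, dim):
        super().__init__()
        self.H = nn.Sequential(nn.Linear(2 * dim, 64), nn.Tanh(), nn.Linear(64, 1))
        self.g = nn.Sequential(nn.Linear(dim, 64), nn.Tanh(), nn.Linear(64, dim))

    def forward(self, q, p, u):
        qp = torch.cat([q, p], dim=-1).detach().requires_grad_(True)
        H = self.H(qp).sum()                        # scalar learned Hamiltonian
        dH = torch.autograd.grad(H, qp, create_graph=True)[0]
        dHdq, dHdp = dH.chunk(2, dim=-1)
        return dHdp, -dHdq + self.g(q) * u          # (dq/dt, dp/dt)
```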

### Hamiltonian Neural Networks

- Physics, Computer Science, NeurIPS
- 2019

This work draws inspiration from Hamiltonian mechanics to train models that learn and respect exact conservation laws in an unsupervised manner; the resulting models train faster and generalize better than a regular neural network.
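
A minimal PyTorch sketch of the Hamiltonian-neural-network idea (layer sizes and names are illustrative, not the paper's exact configuration):

```python
import torch
import torch.nn as nn

class HNN(nn.Module):
    """Sketch: learn a scalar H(q, p); dynamics are its symplectic gradient."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2 * dim, 64), nn.Tanh(), nn.Linear(64, 1))

    def time_derivative(self, x):                  # x = (q, p), shape (batch, 2*dim)
        x = x.detach().requires_grad_(True)
        H = self.net(x).sum()
        dH = torch.autograd.grad(H, x, create_graph=True)[0]
        dHdq, dHdp = dH.chunk(2, dim=-1)
        return torch.cat([dHdp, -dHdq], dim=-1)    # (dq/dt, dp/dt)

# Training regresses the symplectic gradient onto observed time derivatives,
# e.g. loss = ((model.time_derivative(x) - dxdt_observed) ** 2).mean()
```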

### The Eccentric Kozai-Lidov Effect and Its Applications

- Physics, Geology
- 2016

The hierarchical triple-body approximation has useful applications to a variety of systems from planetary and stellar scales to supermassive black holes. In this approximation, the energy of each…

### Physics Informed Deep Learning (Part I): Data-driven Solutions of Nonlinear Partial Differential Equations

- Computer Science, ArXiv
- 2017

This two-part treatise introduces physics-informed neural networks: neural networks trained to solve supervised learning tasks while respecting any given law of physics described by general nonlinear partial differential equations. It demonstrates how these networks can be used to infer solutions to partial differential equations and to obtain physics-informed surrogate models that are fully differentiable with respect to all input coordinates and free parameters.
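
As an illustration, below is a minimal PyTorch sketch of the physics-informed residual for the viscous Burgers equation, one of the paper's canonical examples; the network size and the value of `nu` are placeholders:

```python
import torch
import torch.nn as nn

# u_theta(x, t): a small fully connected surrogate for the PDE solution
net = nn.Sequential(nn.Linear(2, 32), nn.Tanh(), nn.Linear(32, 32), nn.Tanh(),
                    nn.Linear(32, 1))

def burgers_residual(x, t, nu=0.01):
    """PDE residual u_t + u*u_x - nu*u_xx at collocation points (x, t)."""
    x = x.detach().requires_grad_(True)
    t = t.detach().requires_grad_(True)
    u = net(torch.cat([x, t], dim=-1))
    u_t = torch.autograd.grad(u.sum(), t, create_graph=True)[0]
    u_x = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x.sum(), x, create_graph=True)[0]
    return u_t + u * u_x - nu * u_xx

# Total loss: MSE on initial/boundary data + mean(burgers_residual(x, t) ** 2).
```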

### Physics Informed Deep Learning (Part II): Data-driven Discovery of Nonlinear Partial Differential Equations

- Computer Science, ArXiv
- 2017

We introduce physics informed neural networks -- neural networks that are trained to solve supervised learning tasks while respecting any given law of physics described by general nonlinear partial…

### HIGAN: Cosmic Neutral Hydrogen with Generative Adversarial Networks

- Physics, ArXiv
- 2019

This work uses Wasserstein Generative Adversarial Networks (WGANs) to generate new high-resolution 3D realizations of cosmic HI at $z=5$; the samples reproduce the abundance of HI across 9 orders of magnitude, from the Ly$\alpha$ forest to Damped Lyman Absorbers.

### Scalable training of artificial neural networks with adaptive sparse connectivity inspired by network science

- Computer Science, Nature Communications
- 2018

A method to design neural networks as sparse scale-free networks, which reduces the computational time required for training and inference and has the potential to enable artificial neural networks to scale up beyond what is currently possible.
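
A rough NumPy sketch of the prune-and-regrow rewiring step at the heart of this approach; the function name, the `zeta` fraction, and the exact regrowth rule are illustrative assumptions:

```python
import numpy as np

def set_rewire(weights, mask, zeta=0.3, rng=None):
    """One rewiring step (sketch): prune the zeta fraction of active connections
    with the smallest |weight|, then regrow as many at random inactive positions."""
    if rng is None:
        rng = np.random.default_rng()
    active = np.flatnonzero(mask)
    n = int(zeta * active.size)
    order = np.argsort(np.abs(weights.flat[active]))
    mask.flat[active[order[:n]]] = 0                        # prune weakest links
    inactive = np.flatnonzero(mask == 0)
    mask.flat[rng.choice(inactive, n, replace=False)] = 1   # random regrowth
    return mask
```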

### Evolving and Understanding Sparse Deep Neural Networks using Cosine Similarity

- Computer Science, ArXiv
- 2019

This work proposes a novel approach that evolves a sparse network topology based on the behavior of neurons in the network, using the cosine similarities between the activations of any two neurons to determine which connections are added to or removed from the network.
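
A small NumPy sketch of the core computation, the pairwise cosine-similarity matrix between neuron activations; the rewiring decision rule is only summarized in a comment, since the blurb above does not spell it out:

```python
import numpy as np

def cosine_similarity_matrix(acts_a, acts_b):
    """Cosine similarity between every pair of neurons in two layers,
    given activations over a batch: acts_* has shape (batch, n_neurons)."""
    a = acts_a / (np.linalg.norm(acts_a, axis=0, keepdims=True) + 1e-12)
    b = acts_b / (np.linalg.norm(acts_b, axis=0, keepdims=True) + 1e-12)
    return a.T @ b   # shape (n_a, n_b)

# Rewiring heuristic (sketch): use the similarity scores to rank candidate
# connections, adding or removing links between neuron pairs accordingly.
```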