Learn one size to infer all: Exploiting translational symmetries in delay-dynamical and spatiotemporal systems using scalable neural networks.

Goldmann, M., Mirasso, C. R., Fischer, I., and Soriano, M. C., "Learn one size to infer all: Exploiting translational symmetries in delay-dynamical and spatiotemporal systems using scalable neural networks," Physical Review E 106(4-1).
We design scalable neural networks adapted to translational symmetries in dynamical systems, capable of inferring untrained high-dimensional dynamics for different system sizes. We train these networks to predict the dynamics of delay-dynamical and spatiotemporal systems for a single size. Then, we drive the networks by their own predictions. We demonstrate that by scaling the size of the trained network, we can predict the complex dynamics for larger or smaller system sizes. Thus, the network… 
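The closed-loop scheme described above, in which the trained network is driven by its own predictions, can be sketched in a few lines. The one-step predictor below is a hypothetical stand-in (a simple linear map), not the authors' trained network:

```python
import numpy as np

def closed_loop_predict(step, u0, n_steps):
    """Autonomously iterate a trained one-step predictor: each output
    is fed back as the next input, so the network drives itself."""
    u = u0
    trajectory = [u]
    for _ in range(n_steps):
        u = step(u)          # one-step prediction by the trained model
        trajectory.append(u)
    return np.array(trajectory)

# toy stand-in for a trained predictor: a contracting linear map
A = np.array([[0.9, 0.1], [-0.1, 0.9]])
traj = closed_loop_predict(lambda u: A @ u, np.array([1.0, 0.0]), 100)
```

In the paper's setting, `step` would be the trained scalable network, and changing the network size between training and closed-loop operation is what allows inference for unseen system sizes.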



Deep Learning

Deep learning is making major advances in solving problems that have resisted the best attempts of the artificial intelligence community for many years, and will have many more successes in the near future because it requires very little engineering by hand and can easily take advantage of increases in the amount of available computation and data.

The "echo state" approach to analysing and training recurrent neural networks

The report introduces a constructive learning algorithm for recurrent neural networks that modifies only the weights to the output units in order to achieve the learning task.
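A minimal sketch of the echo-state idea: the recurrent and input weights stay fixed and random, and only the output weights are fitted by ridge regression. The toy signal and all hyperparameters here are illustrative assumptions, not values from the report:

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, washout = 200, 1000, 100          # reservoir size, samples, washout

# fixed random reservoir and input weights (never trained)
W = rng.normal(0, 1, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # spectral radius below 1
w_in = rng.normal(0, 0.5, N)

u = np.sin(0.1 * np.arange(T + 1))      # toy scalar input signal
x = np.zeros(N)
states = []
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])    # reservoir state update
    states.append(x)

X = np.array(states)[washout:]          # discard initial transient
y = u[washout + 1:T + 1]                # one-step-ahead target

# only the output weights are trained (ridge regression)
w_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y)
pred = X @ w_out
```

Rescaling the recurrent weights to a spectral radius below 1 is a common heuristic for the echo state property, i.e. the reservoir state asymptotically depending only on the input history.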

Learning Spatiotemporal Chaos Using Next-Generation Reservoir Computing

Spatiotemporal chaos prediction is demonstrated using a machine learning architecture that, combined with a next-generation reservoir computer, displays state-of-the-art performance with a training time orders of magnitude faster and a training data set ∼10 times smaller than other machine learning algorithms.
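Next-generation reservoir computing replaces the random recurrent network with explicit polynomial features of time-delayed inputs followed by a linear readout. A minimal sketch of such a feature map, where the delay depth and polynomial order are illustrative choices rather than values from the paper:

```python
import numpy as np

def ngrc_features(history, k=2):
    """Feature vector from the k most recent inputs: a constant term,
    the delayed inputs themselves, and their unique pairwise products."""
    lin = np.asarray(history[-k:], dtype=float)     # linear (delay) part
    quad = np.outer(lin, lin)[np.triu_indices(k)]   # quadratic monomials
    return np.concatenate(([1.0], lin, quad))

phi = ngrc_features([0.1, 0.2, 0.3], k=2)   # uses the last 2 inputs
# feature count: 1 constant + 2 linear + 3 quadratic = 6
```

Stacking these feature vectors over time and fitting a linear readout by ridge regression replaces the expensive training of a conventional recurrent network, which is where the reported speedup comes from.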

Symmetry-Based Representations for Artificial and Biological General Intelligence

It is argued that symmetry transformations are a fundamental principle that can guide the search for what makes a good representation, and may be an important general framework that determines the structure of the universe, constrains the nature of natural tasks and consequently shapes both biological and artificial intelligence.

Biologically informed deep neural network for prostate cancer discovery

It is demonstrated that P-NET can predict cancer state using molecular data with a performance that is superior to other modelling approaches and revealed established and novel molecularly altered candidates, such as MDM4 and FGFR1, which were implicated in predicting advanced disease and validated in vitro.

Machine-learning hidden symmetries

An automated method is presented that rediscovers the famous Gullstrand-Painlevé metric that manifests hidden translational symmetry in the Schwarzschild metric of non-rotating black holes, as well as Hamiltonicity, modularity and other simplifying traits not traditionally viewed as symmetries.

Parallel Machine Learning for Forecasting the Dynamics of Complex Networks

The utility and scalability of the method are demonstrated by implementing it with reservoir computing on a chaotic network of oscillators, using a parallel architecture that mimics the topology of the network of interest.

Model-free inference of unseen attractors: Reconstructing phase space features from a single noisy trajectory using reservoir computing

The ability to learn the dynamics of a complex system can be extended to systems with multiple co-existing attractors, demonstrated here for a four-dimensional extension of the well-known Lorenz chaotic system.

Physics-informed machine learning

Some of the prevailing trends in embedding physics into machine learning are reviewed, current capabilities and limitations are presented, and diverse applications of physics-informed learning for both forward and inverse problems, including discovering hidden physics and tackling high-dimensional problems, are discussed.

Emergence of transient chaos and intermittency in machine learning

This paper reports the results from a detailed study of the statistical behaviors of transient chaos generated by the parameter-aware reservoir computing machine, and demonstrates that the machine learning framework can reproduce intermittency of the target system.