Parallel Machine Learning for Forecasting the Dynamics of Complex Networks

@article{Srinivasan2022ParallelML,
  title={Parallel Machine Learning for Forecasting the Dynamics of Complex Networks},
  author={Keshav Srinivasan and Nolan J. Coble and Joyce L. Hamlin and Thomas M. Antonsen and Edward Ott and Michelle Girvan},
  journal={Physical Review Letters},
  year={2022},
  volume={128},
  number={16},
  pages={164101}
}
Forecasting the dynamics of large, complex, sparse networks from previous time series data is important in a wide range of contexts. Here we present a machine learning scheme for this task using a parallel architecture that mimics the topology of the network of interest. We demonstrate the utility and scalability of our method, implemented using reservoir computing, on a chaotic network of oscillators. Two levels of prior knowledge are considered: (i) the network links are known, and (ii) the…
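The abstract describes the scheme only at a high level, so the following is a minimal sketch, assuming a standard echo-state-network formulation, of how a parallel per-node reservoir setup of this kind could be organized. The helper names (make_reservoir, train_node_readout), the reservoir size, and all other hyperparameters are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (not the authors' released code) of the parallel reservoir
# computing idea: one small echo state network per network node, each driven
# by the time series of that node and its neighbors, with a ridge-regression
# readout that predicts the node's next value. All hyperparameters below are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(seed=0)

def make_reservoir(n_inputs, n_res=200, spectral_radius=0.9, input_scale=0.5):
    """Random input and recurrent weights for one node's small reservoir."""
    W_in = input_scale * rng.uniform(-1.0, 1.0, size=(n_res, n_inputs))
    W = rng.normal(size=(n_res, n_res))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    return W_in, W

def train_node_readout(u_local, y_target, W_in, W, ridge=1e-6):
    """Drive one reservoir with the local inputs (the node plus its neighbors)
    and fit a linear readout to the node's next-step value by ridge regression."""
    n_res = W.shape[0]
    r = np.zeros(n_res)
    states = np.empty((len(u_local), n_res))
    for t, u in enumerate(u_local):
        r = np.tanh(W @ r + W_in @ u)
        states[t] = r
    # Ridge-regression readout: W_out = Y^T S (S^T S + beta I)^{-1}
    return y_target.T @ states @ np.linalg.inv(states.T @ states + ridge * np.eye(n_res))
```

Because each node's reservoir and readout are trained independently, the training step parallelizes naturally across nodes, and at prediction time each node's forecast would be fed to the reservoirs of its neighbors, mirroring the topology of the network being modeled.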


Learn one size to infer all: Exploiting translational symmetries in delay-dynamical and spatiotemporal systems using scalable neural networks.

A scalable neural network adapted to translational symmetries in dynamical systems is designed that can infer untrained high-dimensional dynamics across different system sizes; by scaling the size of the trained network, it can predict the complex dynamics of larger or smaller systems.

Quantifying Interdependencies in Geyser Eruptions at the Upper Geyser Basin, Yellowstone National Park

The Upper Geyser Basin at Yellowstone National Park (Wyoming, USA) harbors the greatest concentration of geysers worldwide. Research suggests that individual geysers are not isolated but rather are…

A Catch-22 of Reservoir Computing

This work shows that the performance of next-generation reservoir computing (NGRC) models can be extremely sensitive to the choice of readout nonlinearity, and highlights the challenges faced by data-driven methods in learning complex dynamical systems.

References

Showing references 1-10 of 28.

Rapid Time Series Prediction with a Hardware-Based Reservoir Computer

A reservoir computing scheme is presented that achieves rapid processing speed in both the reservoir and the output layer; its utility is demonstrated by training a reservoir to learn the short- and long-term behavior of a chaotic system.

Using Machine Learning to Assess Short Term Causal Dependence and Infer Network Links

Dynamical noise is seen to greatly enhance the effectiveness of the technique, while observational noise degrades it; the competition between these two opposing types of noise is the key factor determining the success of causal inference in many important applications.

Information Processing Capacity of Dynamical Systems

The theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis to define the computational capacity of a dynamical system.

Harnessing Nonlinearity: Predicting Chaotic Systems and Saving Energy in Wireless Communication

We present a method for learning nonlinear systems, echo state networks (ESNs). ESNs employ artificial recurrent neural networks in a way that has recently been proposed independently as a learning mechanism in biological brains.
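Building on this echo-state-network description, here is a minimal sketch of the closed-loop (autonomous) prediction phase that chaotic-system forecasting relies on; it assumes weights W_in, W, and W_out produced by a training step like the one sketched earlier and is an illustration, not code from this reference.

```python
# Minimal sketch of the autonomous (closed-loop) prediction phase of an echo
# state network: the trained readout's output becomes the next input, so the
# network forecasts the system on its own. W_in, W, and W_out are assumed to
# come from a prior training step.
import numpy as np

def predict_autonomous(u0, r0, W_in, W, W_out, n_steps):
    """Run a trained echo state network in closed loop from input u0 and state r0."""
    u = np.asarray(u0, dtype=float)
    r = np.asarray(r0, dtype=float)
    outputs = []
    for _ in range(n_steps):
        r = np.tanh(W @ r + W_in @ u)  # update the reservoir state
        u = np.atleast_1d(W_out @ r)   # readout output is fed back as the next input
        outputs.append(u.copy())
    return np.array(outputs)
```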

Model-free inference of direct network interactions from nonlinear collective dynamics

A model-independent framework is developed for inferring direct network interactions solely from observed time series of the nonlinear collective dynamics, without relying on a model of the system.

Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations

A new computational model for real-time computing on time-varying input is presented that provides an alternative to paradigms based on Turing machines or attractor neural networks; it is based on principles of high-dimensional dynamical systems combined with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry.

Link Prediction in Complex Networks: A Mutual Information Perspective

This paper reexamines the role of network topology in predicting missing links from the perspective of information theory, and presents a practical approach based on the mutual information of network structures that improves prediction accuracy substantially while keeping the computational complexity reasonable.

The "echo state" approach to analysing and training recurrent neural networks

The report introduces a constructive learning algorithm for recurrent neural networks that modifies only the weights to the output units in order to achieve the learning task.

Role-similarity based functional prediction in networked systems: application to the yeast proteome

  • P. Holme, M. Huss
  • Computer Science
  • Journal of The Royal Society Interface
  • 2005
We propose a general method to predict functions of vertices where (i) the wiring of the network is somehow related to the vertex functionality and (ii) a fraction of the vertices are functionally classified.