Corpus ID: 246485884

Neural graphical modelling in continuous-time: consistency guarantees and algorithms

@inproceedings{Bellot2021NeuralGM,
  title={Neural graphical modelling in continuous-time: consistency guarantees and algorithms},
  author={Alexis Bellot and Kim Branson and Mihaela van der Schaar},
  booktitle={International Conference on Learning Representations},
  year={2021}
}
The discovery of structure from time series data is a key problem in fields that study complex systems. Most identifiability results and learning algorithms assume the underlying dynamics to be discrete in time. Comparatively few explicitly define dependencies over infinitesimal intervals of time, independently of the scale of observation and of the regularity of sampling. In this paper, we consider score-based structure learning for the study of dynamical systems. We… 


Identifiability and Asymptotics in Learning Homogeneous Linear ODE Systems from Discrete Observations

This work derives a sufficient condition for the identifiability of homogeneous linear ODE systems from a sequence of equally spaced, error-free observations sampled from a single trajectory, and proposes a new method to infer the causal structure of the ODE system, i.e., to infer whether there is a causal link between system variables.

Sparsity in Continuous-Depth Neural Networks

It is demonstrated that sparsity improves out-of-distribution generalization (for the types of OOD considered) of NODEs, and PathReg, a regularizer that acts directly on entire paths through a neural network and achieves exact zeros, is proposed.

Bayesian Dynamic Causal Discovery

A new framework for Bayesian causal discovery for dynamical systems is proposed and a novel generative flow network architecture (Dyn-GFN) tailored for this task is presented, which imposes an edge-wise sparse prior to sequentially build a k-sparse causal graph.

Towards Better Long-range Time Series Forecasting using Generative Forecasting

A new forecasting strategy called Generative Forecasting (GenF), which generates synthetic data for the next few time steps and then makes long-range forecasts based on generated and observed data, is proposed.

Towards Better Long-range Time Series Forecasting using Generative Adversarial Networks

A new forecasting strategy called Generative Forecasting (GenF) is proposed, which generates synthetic data for the next few time steps and then makes long-range forecasts based on generated and observed data; it is theoretically proved that GenF better balances forecasting variance and bias, leading to a much smaller forecasting error.

References

SHOWING 1-10 OF 82 REFERENCES

Learning Networks of Stochastic Differential Equations

The $\ell_1$-regularized least squares algorithm is analyzed, and it is proved that performance guarantees are uniform in the sampling rate as long as the rate is sufficiently high, which substantiates the notion of a well-defined `time complexity' for the network inference problem.
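The regression underlying this style of result can be sketched as follows: discretize the SDE, regress finite differences of the state on the current state, and solve one lasso problem per node. This is an illustrative toy, not the paper's setup; the drift matrix, penalty level, and the plain ISTA solver used below are all choices made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sparse drift matrix A for the SDE dX = A X dt + dW (illustrative only)
d, T, dt = 5, 20000, 0.01
A = -1.2 * np.eye(d)                       # stabilizing diagonal
A[0, 1], A[1, 2], A[3, 0] = 0.8, -0.6, 0.5  # a few network edges

# Simulate one trajectory with Euler-Maruyama
X = np.zeros((T, d))
for t in range(T - 1):
    X[t + 1] = X[t] + dt * A @ X[t] + np.sqrt(dt) * rng.standard_normal(d)

# l1-regularized least squares on finite differences:
# (X[t+1] - X[t]) / dt ~ A X[t], one lasso problem per row of A,
# solved here with plain ISTA (proximal gradient descent)
Y = (X[1:] - X[:-1]) / dt
Z = X[:-1]
n = Z.shape[0]
L = np.linalg.norm(Z, 2) ** 2 / n          # Lipschitz constant of the gradient
lam = 0.02                                 # penalty level (a tuning choice)

A_hat = np.zeros((d, d))
for i in range(d):
    w = np.zeros(d)
    for _ in range(300):
        grad = Z.T @ (Z @ w - Y[:, i]) / n
        w = w - grad / L
        w = np.sign(w) * np.maximum(np.abs(w) - lam / L, 0.0)  # soft-threshold
    A_hat[i] = w

# Large-magnitude entries of A_hat indicate inferred network edges
print(np.round(A_hat, 2))
```

The uniform-in-sampling-rate flavor of the guarantee can be probed by rerunning this sketch with smaller `dt` and proportionally larger `T`.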

DYNOTEARS: Structure Learning from Time-Series Data

This work revisits the structure learning problem for dynamic Bayesian networks and proposes a method that simultaneously estimates contemporaneous and time-lagged relationships between variables in a time-series, using a recent algebraic result characterizing the acyclicity constraint as a smooth equality constraint.
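The "recent algebraic result" referenced here is, presumably, the NOTEARS characterization h(W) = tr(e^(W∘W)) − d, which equals zero exactly when the weighted adjacency matrix W encodes an acyclic graph. A minimal check of that property (the example matrices are made up here):

```python
import numpy as np
from scipy.linalg import expm

def h(W):
    """NOTEARS acyclicity measure: h(W) = tr(exp(W * W)) - d,
    where * is the elementwise (Hadamard) product.
    Zero exactly when the weighted graph W has no directed cycles."""
    d = W.shape[0]
    return np.trace(expm(W * W)) - d

# Acyclic example: edges 0 -> 1 -> 2
W_dag = np.array([[0.0, 0.5,  0.0],
                  [0.0, 0.0, -0.7],
                  [0.0, 0.0,  0.0]])

# Cyclic example: close the loop with 2 -> 0
W_cyc = W_dag.copy()
W_cyc[2, 0] = 0.3

print(h(W_dag))   # numerically ~0: no cycles
print(h(W_cyc))   # strictly positive: the 3-cycle contributes closed walks
```

Because h is smooth in W, it can be used as an equality constraint inside a continuous optimizer, which is what makes simultaneous estimation of contemporaneous and lagged structure tractable.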

Parameter estimation for differential equations: a generalized smoothing approach

A new method that uses noisy measurements on a subset of variables to estimate the parameters defining a system of non‐linear differential equations, based on a modification of data smoothing methods along with a generalization of profiled estimation is described.
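The two-stage idea, smooth the data and then fit parameters by matching the smoothed derivative to the ODE right-hand side, can be sketched on a one-parameter toy model. The ODE, noise level, and spline smoothing parameter below are assumptions for illustration, not the paper's generalized profiling procedure:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)

# Noisy observations of x' = -theta * x with theta = 1.5 (toy model)
theta_true = 1.5
t = np.linspace(0.0, 3.0, 60)
x_obs = np.exp(-theta_true * t) + 0.01 * rng.standard_normal(t.size)

# Stage 1: smooth the data (s is set to the expected residual sum
# for noise of std 0.01, a standard heuristic)
spline = UnivariateSpline(t, x_obs, s=60 * 0.01**2)
x_s = spline(t)
dx_s = spline.derivative()(t)

# Stage 2: estimate theta by matching the smoothed derivative to the
# ODE right-hand side: minimize sum (dx_s + theta * x_s)^2, which has
# the closed-form solution below
theta_hat = -np.sum(dx_s * x_s) / np.sum(x_s**2)
print(theta_hat)
```

The appeal of the approach is visible even in this sketch: no numerical ODE solves are needed inside the fitting loop, only a smoother and a regression.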

Economy Statistical Recurrent Units For Inferring Nonlinear Granger Causality

This work makes a case that the network topology of Granger causal relations is directly inferrable from a structured sparse estimate of the internal parameters of the SRU networks trained to predict the processes’ time series measurements.

Learning unknown ODE models with Gaussian processes

This work proposes to learn non-linear, unknown differential functions from state observations using Gaussian process vector fields within the exact ODE formalism, and demonstrates the model's ability to infer dynamics from sparse data and to simulate the system forward into the future.

Learning Causal Structure from Undersampled Time Series

A framework and foundation for learning causal structure from this type of complex time series data is provided, and an algorithm is developed for inferring aspects of the causal structure at the "true" timescale from the causal structures learned from the undersampled data.

Discovering governing equations from data by sparse identification of nonlinear dynamical systems

This work develops a novel framework to discover governing equations underlying a dynamical system simply from data measurements, leveraging advances in sparsity techniques and machine learning and using sparse regression to determine the fewest terms in the dynamic governing equations required to accurately represent the data.
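The sparse-regression step of this framework (sequentially thresholded least squares over a library of candidate functions) can be sketched on a toy linear oscillator; the library, threshold, and system below are illustrative choices, not the paper's examples:

```python
import numpy as np

# Simulate dx/dt = -0.1x + 2y, dy/dt = -2x - 0.1y (damped oscillator)
dt, T = 0.001, 20000
A = np.array([[-0.1, 2.0], [-2.0, -0.1]])
X = np.zeros((T, 2))
X[0] = [2.0, 0.0]
for t in range(T - 1):
    X[t + 1] = X[t] + dt * A @ X[t]        # Euler integration (noise-free, for clarity)

dX = (X[1:] - X[:-1]) / dt                 # finite-difference derivative estimates
x, y = X[:-1, 0], X[:-1, 1]

# Candidate function library: polynomials up to degree 2
Theta = np.column_stack([np.ones_like(x), x, y, x * x, x * y, y * y])
names = ["1", "x", "y", "x^2", "xy", "y^2"]

def stlsq(Theta, dX, threshold=0.05, iters=10):
    """Sequentially thresholded least squares: fit, zero out small
    coefficients, refit on the surviving terms, repeat."""
    Xi = np.linalg.lstsq(Theta, dX, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(Xi) < threshold
        Xi[small] = 0.0
        for k in range(dX.shape[1]):
            big = ~small[:, k]
            if big.any():
                Xi[big, k] = np.linalg.lstsq(Theta[:, big], dX[:, k], rcond=None)[0]
    return Xi

Xi = stlsq(Theta, dX)
for k, lhs in enumerate(["dx/dt", "dy/dt"]):
    terms = [f"{Xi[j, k]:+.2f} {names[j]}" for j in range(len(names)) if Xi[j, k] != 0]
    print(lhs, "=", " ".join(terms))
```

The printed equations recover the two linear terms per coordinate and drop the constant and quadratic library entries, which is the "fewest terms required to represent the data" behavior the summary describes.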

Network Reconstruction From High-Dimensional Ordinary Differential Equations

It is shown that the proposed method can consistently recover the true network structure even in high dimensions, and empirical improvement over competing approaches is demonstrated.

Optimal rate of direct estimators in systems of ordinary differential equations linear in functions of the parameters

Many processes in biology, chemistry, physics, medicine, and engineering are modeled by a system of differential equations. Such a system is usually characterized via unknown parameters and… 

Neural Granger Causality for Nonlinear Time Series

This work proposes a class of nonlinear methods by applying structured multilayer perceptrons (MLPs) or recurrent neural networks (RNNs) combined with sparsity-inducing penalties on the weights to extract the Granger causal structure.
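The core construction, a per-target network whose first-layer input groups are penalized so that an entire input series' weights can be driven toward zero, can be sketched with a small numpy MLP and a group-lasso proximal step. The architecture, penalty weight, and data-generating process here are assumptions for illustration, not the paper's cMLP/cRNN models:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two series; y_t depends only on x_{t-1} (a single Granger link)
T = 2000
x = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = np.tanh(1.5 * x[t - 1]) + 0.1 * rng.standard_normal()

# One-hidden-layer MLP predicting y_t from (x_{t-1}, y_{t-1}), with a
# group-lasso penalty on the input rows of W1: a zeroed row means
# "this series has no inferred Granger influence on y".
H, lam, lr = 8, 0.01, 0.05
W1 = 0.1 * rng.standard_normal((2, H))
b1 = np.zeros(H)
w2 = 0.1 * rng.standard_normal(H)
b2 = 0.0
Z = np.column_stack([x[:-1], y[:-1]])
target = y[1:]

def prox(row, step):
    """Group soft-thresholding of one input group (row of W1)."""
    norm = np.linalg.norm(row)
    return row * max(0.0, 1.0 - step * lam / norm) if norm > 0 else row

for _ in range(500):                       # proximal gradient descent
    Hid = np.tanh(Z @ W1 + b1)
    pred = Hid @ w2 + b2
    err = pred - target                    # gradient of 0.5 * mean squared error
    g2 = Hid.T @ err / len(err)
    gh = np.outer(err, w2) * (1.0 - Hid**2)
    gW1 = Z.T @ gh / len(err)
    W1 -= lr * gW1
    b1 -= lr * gh.mean(0)
    w2 -= lr * g2
    b2 -= lr * err.mean()
    for j in range(2):
        W1[j] = prox(W1[j], lr)

# Group norms per input series: the x group should dominate the y group
print(np.linalg.norm(W1, axis=1))
```

The structured sparse estimate described in the summaries above corresponds to reading the Granger graph off these group norms: large norm, inferred edge; (near-)zero norm, no edge.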
...