• Corpus ID: 246485884

# Neural graphical modelling in continuous-time: consistency guarantees and algorithms

@inproceedings{Bellot2021NeuralGM,
  title     = {Neural graphical modelling in continuous-time: consistency guarantees and algorithms},
  author    = {Alexis Bellot and Kim Branson and Mihaela van der Schaar},
  booktitle = {International Conference on Learning Representations},
  year      = {2021}
}
• Published in International Conference on…, 6 May 2021
• Computer Science, Mathematics
The discovery of structure from time series data is a key problem in fields of study working with complex systems. Most identifiability results and learning algorithms assume the underlying dynamics to be discrete in time. Comparatively few explicitly define dependencies in infinitesimal intervals of time, independently of the scale of observation and of the regularity of sampling. In this paper, we consider score-based structure learning for the study of dynamical systems. We…

## Figures and Tables from this paper

## Citations

• Mathematics, Computer Science
ArXiv
• 2022
This work derives a sufficient condition for the identifiability of homogeneous linear ODE systems from a sequence of equally spaced, error-free observations sampled from a single trajectory, and proposes a new method to infer the causal structure of the ODE system, i.e., whether there is a causal link between system variables.
• Computer Science
ArXiv
• 2022
This work demonstrates that sparsity improves out-of-distribution generalization (for the types of OOD considered) of NODEs, and proposes PathReg, a regularizer acting directly on entire paths through a neural network and achieving exact zeros.
• Computer Science
• 2022
A new framework for Bayesian causal discovery for dynamical systems is proposed, along with a novel generative flow network architecture (Dyn-GFN) tailored for this task, which imposes an edge-wise sparse prior to sequentially build a k-sparse causal graph.
• Computer Science
ArXiv
• 2022
A new forecasting strategy called Generative Forecasting (GenF), which generates synthetic data for the next few time steps and then makes long-range forecasts based on generated and observed data, is proposed.
• Computer Science
ArXiv
• 2021
A new forecasting strategy called Generative Forecasting (GenF), which generates synthetic data for the next few time steps and then makes long-range forecasts based on generated and observed data, and theoretically proves that GenF is able to better balance the forecasting variance and bias, leading to a much smaller forecasting error.
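The two-stage GenF scheme summarized above can be sketched in a few lines. This is a toy illustration under our own assumptions: `one_step` and `long_range` stand in for whatever learned predictors a real system would use, and the function name is ours, not the cited papers'.

```python
import numpy as np

def generative_forecast(series, one_step, long_range, n_synth):
    """GenF-style forecast (toy sketch): roll a one-step model forward
    n_synth steps to create synthetic observations, then apply a
    long-range model to the extended (observed + synthetic) window."""
    extended = list(series)
    for _ in range(n_synth):
        extended.append(one_step(np.array(extended)))  # synthetic next step
    return long_range(np.array(extended))
```

With an AR(1) process, for example, a one-step predictor that multiplies the last value by the AR coefficient lets the long-range model start from a window two steps closer to the target horizon, which is the variance/bias trade-off the papers analyze.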

## References

Showing 1–10 of 82 references

• Computer Science, Mathematics
NIPS
• 2010
The $\ell_1$-regularized least squares algorithm is analyzed, and it is proved that the performance guarantees are uniform in the sampling rate as long as this is sufficiently high, which substantiates the notion of a well-defined 'time complexity' for the network inference problem.
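An $\ell_1$-regularized least squares estimator of this kind can be sketched by regressing finite-difference derivative estimates on the observed states and solving the lasso with proximal gradient (ISTA). This is a minimal illustration under our own conventions (linear dynamics dx/dt = Ax, function and variable names ours), not the paper's exact algorithm or guarantees.

```python
import numpy as np

def l1_network_inference(trajs, dt, lam=0.05, n_iter=2000):
    """Estimate an interaction matrix A for dx/dt = A x by l1-regularized
    least squares on finite-difference derivative estimates, via ISTA.
    trajs: list of (T_i, p) arrays of states sampled at spacing dt."""
    Z = np.concatenate([X[:-1] for X in trajs])                   # states
    dX = np.concatenate([(X[1:] - X[:-1]) / dt for X in trajs])   # derivative estimates
    n, p = Z.shape
    step = 1.0 / np.linalg.eigvalsh(Z.T @ Z / n).max()            # 1 / Lipschitz constant
    A = np.zeros((p, p))
    for _ in range(n_iter):
        grad = (Z @ A.T - dX).T @ Z / n                           # row i: gradient for node i
        A -= step * grad
        A = np.sign(A) * np.maximum(np.abs(A) - step * lam, 0.0)  # soft-threshold (l1 prox)
    return A
```

On data simulated from a sparse linear system, the zero pattern of the returned matrix is the inferred network; the cited analysis concerns how this recovery behaves as the sampling rate dt varies.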
• Computer Science
AISTATS
• 2020
This work revisits the structure learning problem for dynamic Bayesian networks and proposes a method that simultaneously estimates contemporaneous and time-lagged relationships between variables in a time-series, using a recent algebraic result characterizing the acyclicity constraint as a smooth equality constraint.
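The smooth algebraic characterization of acyclicity mentioned above can be illustrated in a few lines. This sketch uses the polynomial variant h(W) = tr[(I + W∘W/d)^d] − d (as in some follow-up work to NOTEARS); it shows only the constraint function, not the cited structure-learning method.

```python
import numpy as np

def acyclicity(W):
    """Smooth acyclicity measure for a weighted adjacency matrix W:
    h(W) = tr[(I + (W*W)/d)^d] - d, which is 0 iff the graph is a DAG
    and positive (and differentiable in W) otherwise."""
    d = W.shape[0]
    M = np.eye(d) + (W * W) / d   # elementwise square keeps entries nonnegative
    return np.trace(np.linalg.matrix_power(M, d)) - d
```

Structure learners of this family add the equality constraint h(W) = 0 to a score objective, typically enforced with an augmented Lagrangian, so the whole problem can be solved by smooth optimization.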
• Mathematics, Computer Science
• 2006
A new method is described that uses noisy measurements on a subset of variables to estimate the parameters defining a system of non-linear differential equations, based on a modification of data smoothing methods along with a generalization of profiled estimation.
• Computer Science
ICLR
• 2020
This work makes a case that the network topology of Granger causal relations is directly inferrable from a structured sparse estimate of the internal parameters of the SRU networks trained to predict the processes’ time series measurements.
• Computer Science
ICML
• 2018
This work proposes to learn non-linear, unknown differential functions from state observations using Gaussian process vector fields within the exact ODE formalism, and demonstrates the model's capabilities to infer dynamics from sparse data and to simulate the system forward into the future.
• Computer Science
• 2013
A framework and foundation for learning causal structure from this type of complex time series data is provided, and an algorithm is developed for inferring aspects of the causal structure at the "true" timescale from the causal structures learned from the undersampled data.
• Computer Science
Proceedings of the National Academy of Sciences
• 2016
This work develops a novel framework to discover governing equations underlying a dynamical system simply from data measurements, leveraging advances in sparsity techniques and machine learning and using sparse regression to determine the fewest terms in the dynamic governing equations required to accurately represent the data.
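The sparse-regression recipe described above can be sketched as sequential thresholded least squares over a small library of candidate terms. This is a minimal illustration: the function name, the polynomial library, and the thresholding schedule are our assumptions, not the authors' code.

```python
import numpy as np

def sindy(X, dt, threshold=0.1, n_sweeps=10):
    """Sparse identification of governing equations (minimal sketch):
    regress finite-difference derivatives on a polynomial library, then
    repeatedly zero small coefficients and refit (sequential thresholded LSQ)."""
    dX = (X[1:] - X[:-1]) / dt
    x = X[:-1]
    # candidate library: constant, linear, and quadratic terms
    cols = [np.ones(len(x))] + [x[:, i] for i in range(x.shape[1])]
    names = ["1"] + [f"x{i}" for i in range(x.shape[1])]
    for i in range(x.shape[1]):
        for j in range(i, x.shape[1]):
            cols.append(x[:, i] * x[:, j])
            names.append(f"x{i}*x{j}")
    Theta = np.stack(cols, axis=1)
    Xi, *_ = np.linalg.lstsq(Theta, dX, rcond=None)
    for _ in range(n_sweeps):
        small = np.abs(Xi) < threshold
        Xi[small] = 0.0
        for k in range(dX.shape[1]):          # refit each equation on its active terms
            big = ~small[:, k]
            if big.any():
                Xi[big, k], *_ = np.linalg.lstsq(Theta[:, big], dX[:, k], rcond=None)
    return Xi, names
```

The surviving nonzero rows of `Xi` name the fewest library terms needed to reproduce the derivatives, which is exactly the parsimony principle the cited framework builds on.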
• Computer Science, Mathematics
Journal of the American Statistical Association
• 2017
It is shown that the proposed method can consistently recover the true network structure even in high dimensions, and empirical improvement over competing approaches is demonstrated.
• Mathematics
• 2013
Many processes in biology, chemistry, physics, medicine, and engineering are modeled by a system of differential equations. Such a system is usually characterized via unknown parameters and…
• Computer Science
• 2018
This work proposes a class of nonlinear methods by applying structured multilayer perceptrons (MLPs) or recurrent neural networks (RNNs) combined with sparsity-inducing penalties on the weights to extract the Granger causal structure.
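The sparsity-inducing penalty on the weights can be illustrated by its proximal operator: with one group per candidate driving series, a group that shrinks exactly to zero corresponds to declaring no Granger-causal influence from that series. A hypothetical minimal sketch — the column-per-input grouping convention is our assumption, not the paper's exact parameterization:

```python
import numpy as np

def group_lasso_prox(W1, lam):
    """Proximal step for a group-lasso penalty on first-layer MLP weights.
    W1 has shape (hidden, p): column j holds every weight fed by input
    series j, so zeroing a whole column removes series j as a Granger cause."""
    norms = np.linalg.norm(W1, axis=0, keepdims=True)          # one norm per input group
    scale = np.maximum(1.0 - lam / np.maximum(norms, 1e-12), 0.0)
    return W1 * scale                                          # block soft-threshold
```

In training, a step like this would be applied after each gradient step on the prediction loss (with `lam` scaled by the step size), so weak input groups are driven exactly to zero rather than merely small.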