Corpus ID: 235670078

Continuous Latent Process Flows

@inproceedings{Deng2021ContinuousLP,
  title={Continuous Latent Process Flows},
  author={Ruizhi Deng and Marcus A. Brubaker and Greg Mori and Andreas M. Lehrmann},
  booktitle={NeurIPS},
  year={2021}
}
Partial observations of continuous time-series dynamics at arbitrary time stamps exist in many disciplines. Fitting this type of data using statistical models with continuous dynamics is not only promising at an intuitive level but also has practical benefits, including the ability to generate continuous trajectories and to perform inference on previously unseen time stamps. Despite exciting progress in this area, the existing models still face challenges in terms of their representation power… 
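
To make the setting concrete, here is a minimal sketch of simulating a continuous-time latent process at arbitrary, irregular time stamps: an Ornstein-Uhlenbeck SDE observed with Gaussian noise. The process, its parameters, and the emission model are illustrative assumptions, not the architecture proposed in the paper.

```python
import numpy as np

def sample_ou_at(ts, theta=1.0, mu=0.0, sigma=0.5, z0=0.0, seed=0):
    """Sample an Ornstein-Uhlenbeck latent path dZ = theta*(mu - Z) dt + sigma dW
    at arbitrary time stamps, using its exact Gaussian transition density."""
    rng = np.random.default_rng(seed)
    zs, z, t_prev = [], z0, ts[0]
    for t in ts:
        dt = t - t_prev
        mean = mu + (z - mu) * np.exp(-theta * dt)
        var = sigma**2 / (2 * theta) * (1 - np.exp(-2 * theta * dt))
        z = mean + np.sqrt(var) * rng.standard_normal()
        zs.append(z)
        t_prev = t
    return np.array(zs)

ts = np.array([0.0, 0.13, 0.58, 0.61, 1.7, 2.05])     # irregular observation times
latent = sample_ou_at(ts, z0=1.0)
obs = latent + 0.1 * np.random.default_rng(1).standard_normal(len(ts))  # noisy emissions
print(np.c_[ts, latent, obs])
```

Because the transition density is defined for any time gap, the same toy model can be queried at previously unseen time stamps, which is the practical benefit the abstract refers to.
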
1 Citation

Modeling Irregular Time Series with Continuous Recurrent Units

TLDR
This work empirically studies the CRU on a number of challenging datasets and finds that it can interpolate irregular time series better than methods based on neural ordinary differential equations.

References

SHOWING 1-10 OF 41 REFERENCES

Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows

TLDR
A novel type of normalizing flow driven by a differential deformation of the Wiener process is proposed, obtaining a rich time series model whose observable process inherits many of the appealing properties of its base process, such as efficient computation of likelihoods and marginals.
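
As a rough illustration of the idea (not this paper's learned flow), one can deform a Wiener base process with a simple time-indexed invertible map and still evaluate marginal likelihoods in closed form via the change-of-variables formula. The affine map g_t below is an assumption made purely for illustration.

```python
import numpy as np
from scipy.stats import norm

# Toy, time-indexed invertible map applied pointwise to a Wiener base process:
# X_t = g_t(W_t) with g_t(w) = a(t) * w + b(t).
def a(t): return 1.0 + 0.5 * t
def b(t): return np.sin(t)

def log_likelihood(x, t):
    """Exact marginal log-density of X_t at value x via change of variables:
    W_t ~ N(0, t), so log p_X(x) = log p_W(g_t^{-1}(x)) - log |a(t)|."""
    w = (x - b(t)) / a(t)                 # invert the flow
    return norm.logpdf(w, loc=0.0, scale=np.sqrt(t)) - np.log(np.abs(a(t)))

# Sample a trajectory of X on irregular time stamps from Brownian increments.
rng = np.random.default_rng(0)
ts = np.array([0.2, 0.5, 1.3, 2.0])
w = np.cumsum(rng.standard_normal(len(ts)) * np.sqrt(np.diff(ts, prepend=0.0)))
x = a(ts) * w + b(ts)
print(x, [log_likelihood(xi, ti) for xi, ti in zip(x, ts)])
```
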

Neural ODE Processes

TLDR
By maintaining an adaptive data-dependent distribution over the underlying ODE, this model can successfully capture the dynamics of low-dimensional systems from just a few data points and scale up to challenging high-dimensional time series with unknown latent dynamics such as rotating MNIST digits.

Scalable Reversible Generative Models with Free-form Continuous Dynamics

TLDR
This paper uses Hutchinson’s trace estimator to give a scalable unbiased estimate of the log-density and demonstrates the approach on high-dimensional density estimation, image generation, and variational inference, achieving the state of the art among exact likelihood methods with efficient sampling.
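
For reference, Hutchinson's estimator itself is easy to state: tr(J) = E[v^T J v] for random probes v with zero mean and identity covariance. Below is a minimal sketch using Rademacher probes on a linear map, where the trace is known exactly; the function names are illustrative only.

```python
import numpy as np

def hutchinson_trace(jvp, dim, n_samples=1000, seed=0):
    """Unbiased stochastic estimate of tr(J) via E[v^T J v] with Rademacher v,
    where jvp(v) returns the Jacobian-vector product J @ v."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_samples):
        v = rng.choice([-1.0, 1.0], size=dim)
        total += v @ jvp(v)
    return total / n_samples

# Illustration on a linear map f(x) = A x, whose Jacobian is A itself.
A = np.random.default_rng(1).standard_normal((5, 5))
print(hutchinson_trace(lambda v: A @ v, dim=5), "vs exact", np.trace(A))
```
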

Neural Rough Differential Equations for Long Time Series

TLDR
By generalising the Neural CDE approach to a broader class of driving signals, this approach demonstrates efficacy on problems of length up to 17k observations, with significant training speed-ups, improvements in model performance, and reduced memory requirements compared to existing approaches.

Latent Ordinary Differential Equations for Irregularly-Sampled Time Series

TLDR
This work generalizes RNNs to have continuous-time hidden dynamics defined by ordinary differential equations (ODEs), a model called ODE-RNNs, which outperform their RNN-based counterparts on irregularly-sampled data.
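
A hedged sketch of the ODE-RNN recurrence: the hidden state follows an ODE between observation times and is updated by an RNN cell at each observation. The vector field, cell, and weights below are toy placeholders, not the learned networks from the paper.

```python
import numpy as np

def ode_rnn(ts, xs, h0, A, Wx, Wh, n_euler=10):
    """Toy ODE-RNN: between observations the hidden state h follows dh/dt = tanh(A h)
    (fixed-step Euler); at each observation it is updated by a simple RNN cell
    h <- tanh(Wx x + Wh h)."""
    h, hs, t_prev = h0, [], ts[0]
    for t, x in zip(ts, xs):
        dt = (t - t_prev) / n_euler
        for _ in range(n_euler):              # evolve h through the gap since the last observation
            h = h + dt * np.tanh(A @ h)
        h = np.tanh(Wx @ x + Wh @ h)          # incorporate the new observation
        hs.append(h)
        t_prev = t
    return np.stack(hs)

rng = np.random.default_rng(0)
d_h, d_x = 4, 2
ts = np.array([0.0, 0.4, 0.45, 1.9])          # irregular sampling times
xs = rng.standard_normal((len(ts), d_x))
print(ode_rnn(ts, xs, np.zeros(d_h), 0.3 * rng.standard_normal((d_h, d_h)),
              rng.standard_normal((d_h, d_x)), rng.standard_normal((d_h, d_h))))
```
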

Improved Variational Inference with Inverse Autoregressive Flow

TLDR
A new type of normalizing flow, inverse autoregressive flow (IAF), is proposed that, in contrast to earlier published flows, scales well to high-dimensional latent spaces and significantly improves upon diagonal Gaussian approximate posteriors.
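
The core IAF update is simple to sketch: with an autoregressive shift m and gate sigmoid(s), each coordinate of which depends only on earlier coordinates, the map z' = sigmoid(s) * z + (1 - sigmoid(s)) * m has a triangular Jacobian, so its log-determinant is just the sum of log sigmoid(s). The masked linear "networks" below are toy stand-ins for the MADE-style networks used in the paper.

```python
import numpy as np

def sigmoid(x): return 1.0 / (1.0 + np.exp(-x))

def iaf_step(z, Wm, Ws, bm, bs):
    """One inverse-autoregressive-flow step with linear autoregressive nets.
    Wm, Ws are strictly lower-triangular so m_i, s_i depend only on z_{<i}."""
    m, s = Wm @ z + bm, Ws @ z + bs
    gate = sigmoid(s)
    z_new = gate * z + (1.0 - gate) * m
    logdet = np.sum(np.log(gate))          # Jacobian is triangular with diagonal = gate
    return z_new, logdet

rng = np.random.default_rng(0)
d = 5
mask = np.tril(np.ones((d, d)), k=-1)                  # strict autoregressive dependency
Wm, Ws = rng.standard_normal((d, d)) * mask, rng.standard_normal((d, d)) * mask
eps = rng.standard_normal(d)                           # base sample from N(0, I)
z, logdet = iaf_step(eps, Wm, Ws, rng.standard_normal(d), rng.standard_normal(d))
log_q = -0.5 * np.sum(eps**2) - 0.5 * d * np.log(2 * np.pi) - logdet  # density of z
print(z, log_q)
```

Because sampling is a single parallel pass over dimensions, stacking such steps scales well to high-dimensional latent spaces, which is the property highlighted above.
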

Neural Ordinary Differential Equations

TLDR
This work shows how to scalably backpropagate through any ODE solver, without access to its internal operations, which allows end-to-end training of ODEs within larger models.
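
A worked toy example of adjoint-based backpropagation through an ODE solve, for the scalar ODE dz/dt = theta * z with loss L = z(T): the adjoint a(t) = dL/dz(t) is integrated backwards together with a re-integration of z, so no intermediate solver states need to be stored. The Euler scheme and step sizes are illustrative choices, not the method's actual solver.

```python
import numpy as np

# Adjoint: a(t) solves da/dt = -a * df/dz = -a * theta, with a(T) = 1, and
# dL/dtheta = integral_0^T a(t) * df/dtheta dt = integral_0^T a(t) * z(t) dt.
theta, z0, T, n = 0.7, 1.5, 2.0, 20000
dt = T / n

# Forward pass: integrate z to time T (only the final state is kept).
z = z0
for _ in range(n):
    z += dt * theta * z

# Backward pass: re-integrate z in reverse time together with the adjoint,
# accumulating the parameter gradient along the way.
a, grad_theta = 1.0, 0.0
for _ in range(n):
    grad_theta += dt * a * z
    a += dt * theta * a      # reverse-time Euler step of da/dt = -theta * a
    z -= dt * theta * z      # reverse-time Euler step of dz/dt =  theta * z

print(grad_theta, "vs analytic", T * z0 * np.exp(theta * T))
```
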

Residual Flows for Invertible Generative Modeling

TLDR
The resulting approach, called Residual Flows, achieves state-of-the-art performance on density estimation amongst flow-based models, and outperforms networks that use coupling blocks at joint generative and discriminative modeling.
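
One ingredient of this construction is easy to sketch: a residual block y = x + g(x) is invertible whenever g is contractive (Lipschitz constant below 1), and its inverse can be computed by fixed-point iteration. The spectral-norm scaling and tanh branch below are illustrative; the unbiased log-determinant estimator used by Residual Flows is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3
W = rng.standard_normal((d, d))
W *= 0.5 / np.linalg.norm(W, 2)          # scale so the spectral norm is 0.5 (< 1)

def g(x):                                 # residual branch; tanh is 1-Lipschitz, so Lip(g) <= 0.5
    return np.tanh(W @ x)

def forward(x):                           # invertible residual block y = x + g(x)
    return x + g(x)

def inverse(y, n_iter=50):                # Banach fixed-point iteration x <- y - g(x)
    x = y.copy()
    for _ in range(n_iter):
        x = y - g(x)
    return x

x = rng.standard_normal(d)
print(np.max(np.abs(inverse(forward(x)) - x)))   # recovers x to numerical precision
```
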

Gaussian Process Approximations of Stochastic Differential Equations

TLDR
A novel Gaussian process approximation to the posterior measure over paths for a general class of stochastic differential equations in the presence of observations is presented, and the results are very promising, as the variational approximate solution outperforms standard Gaussian process regression for non-Gaussian Markov processes.

Neural Controlled Differential Equations for Irregular Time Series

TLDR
The resulting neural controlled differential equation model is directly applicable to the general setting of partially-observed, irregularly-sampled multivariate time series, and (unlike previous work on this problem) it may utilise memory-efficient adjoint-based backpropagation even across observations.
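
A minimal sketch of the underlying mechanism, assuming a fixed toy vector field rather than a learned network: the hidden state z solves the controlled differential equation dz = f(z) dX, where the control path X stacks time with the observed values, and an Euler discretization advances z by f(z) times each increment of X.

```python
import numpy as np

def cde_euler(path, z0, f):
    """Euler discretization of the controlled differential equation dz = f(z) dX.
    `path` holds the control path values X at the observation times; f maps the
    hidden state to a (d_z x d_X) matrix."""
    z = z0
    for x0, x1 in zip(path[:-1], path[1:]):
        z = z + f(z) @ (x1 - x0)           # increments of the control path drive the state
    return z

rng = np.random.default_rng(0)
d_z, d_x = 4, 2
ts = np.array([0.0, 0.3, 0.35, 1.2, 2.0])             # irregular observation times
xs = rng.standard_normal((len(ts), d_x))               # observed values at those times
path = np.column_stack([ts, xs])                       # control path X_t = (t, x_t)
A = 0.1 * rng.standard_normal((d_z * (d_x + 1), d_z))
f = lambda z: np.tanh(A @ z).reshape(d_z, d_x + 1)     # placeholder vector field (a real model learns this)
print(cde_euler(path, np.zeros(d_z), f))
```
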