Corpus ID: 211296419

Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows

@article{Deng2020ModelingCS,
  title={Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows},
  author={Ruizhi Deng and B. Chang and Marcus A. Brubaker and Greg Mori and Andreas M. Lehrmann},
  journal={ArXiv},
  year={2020},
  volume={abs/2002.10516}
}
Normalizing flows transform a simple base distribution into a complex target distribution and have proved to be powerful models for data generation and density estimation. In this work, we propose a novel type of normalizing flow driven by a differential deformation of the Wiener process. As a result, we obtain a rich time series model whose observable process inherits many of the appealing properties of its base process, such as efficient computation of likelihoods and marginals. Furthermore… 
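
Read literally, the abstract describes pushing a simple base process (a Wiener process) through a time-indexed invertible deformation, so that marginals of the observable process follow from the ordinary change-of-variables formula. A minimal NumPy sketch of that idea, assuming a hand-written affine deformation; scale, shift, flow_logpdf, and sample_path are illustrative names, not the paper's code.

import numpy as np
from scipy.stats import norm

# Illustrative time-dependent affine flow: x = F(w, t) = scale(t) * w + shift(t).
# The Wiener process has W_t ~ N(0, t), so the marginal of X_t is available in
# closed form via the change-of-variables formula. This is only a sketch of the
# general construction, not the learned deformation used in the paper.
def scale(t): return 1.0 + 0.5 * t      # kept positive so F(., t) is invertible
def shift(t): return np.sin(t)

def flow_logpdf(x, t):
    w = (x - shift(t)) / scale(t)                          # invert the flow
    base = norm.logpdf(w, loc=0.0, scale=np.sqrt(t))       # Wiener marginal W_t ~ N(0, t)
    return base - np.log(scale(t))                         # minus log|dF/dw| for an affine map

def sample_path(ts, rng):
    # Sample a Wiener path on the grid ts and push it through the flow.
    dw = rng.normal(0.0, np.sqrt(np.diff(ts, prepend=0.0)))
    return scale(ts) * np.cumsum(dw) + shift(ts)

rng = np.random.default_rng(0)
ts = np.linspace(0.01, 2.0, 50)
x = sample_path(ts, rng)
print(flow_logpdf(x[-1], ts[-1]))
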

Citations

Continuous Latent Process Flows
TLDR
CLPF is a principled architecture that decodes continuous latent processes into continuous observable processes using a time-dependent normalizing flow driven by a stochastic differential equation; a novel piecewise construction of the variational posterior process yields the corresponding variational lower bound via importance weighting of trajectories.
Neural Spatio-Temporal Point Processes
We propose a new class of parameterizations for spatio-temporal point processes which leverage Neural ODEs as a computational method and enable flexible, high-fidelity models of discrete events that are localized in continuous space and time.
Neural SDEs Made Easy: SDEs are Infinite-Dimensional GANs
Several authors have introduced Neural Stochastic Differential Equations (Neural SDEs), often involving complex theory with various limitations. Here, we aim to introduce a generic, user-friendly approach to neural SDEs.
Neural ODE Processes
TLDR
By maintaining an adaptive data-dependent distribution over the underlying ODE, this model can successfully capture the dynamics of low-dimensional systems from just a few data-points and scale up to challenging high-dimensional time-series with unknown latent dynamics such as rotating MNIST digits.
Efficient and Accurate Gradients for Neural SDEs
TLDR
The reversible Heun method is introduced, a new SDE solver that is algebraically reversible, eliminating numerical gradient errors, and the first such solver of which the authors are aware.
Neural Controlled Differential Equations for Irregular Time Series
TLDR
The resulting neural controlled differential equation model is directly applicable to the general setting of partially-observed irregularly-sampled multivariate time series, and (unlike previous work on this problem) it may utilise memory-efficient adjoint-based backpropagation even across observations.
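
For context, the controlled-differential-equation idea behind this model can be sketched with the simplest possible discretization of dz = f(z) dX: z_{k+1} = z_k + f(z_k)(x_{k+1} - x_k), where X is built from the irregularly sampled observations. The paper itself interpolates the observations into a continuous path and integrates with an ODE solver and adjoint backpropagation, which this toy NumPy sketch omits; vector_field and cde_euler are illustrative names.

import numpy as np

dim_z, dim_x = 4, 2
rng = np.random.default_rng(1)
W = 0.1 * rng.normal(size=(dim_z * dim_x, dim_z))

def vector_field(z):
    # Maps the hidden state to a (dim_z, dim_x) matrix; a trained model would
    # use a neural network here instead of a fixed random linear map.
    return np.tanh(W @ z).reshape(dim_z, dim_x)

def cde_euler(ts, xs, z0):
    # Explicit Euler step for dz = f(z) dX: irregular spacing of the observation
    # times ts is absorbed into the increments of the control path X.
    z = z0
    for k in range(len(ts) - 1):
        z = z + vector_field(z) @ (xs[k + 1] - xs[k])
    return z

ts = np.array([0.0, 0.3, 0.35, 1.2, 2.0])     # irregular observation times
xs = rng.normal(size=(len(ts), dim_x))        # observed values (one channel is often time itself)
print(cde_euler(ts, xs, np.zeros(dim_z)))
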
Agent Forecasting at Flexible Horizons using ODE Flows
TLDR
OMEN’s architecture embeds an assumption that marginal distributions of a given agent moving forward in time are related, allowing for an efficient representation of marginal distributions through time and for reliable interpolation between prediction horizons seen in training.
Time Series Data Augmentation for Deep Learning: A Survey
TLDR
This paper systematically reviews different data augmentation methods for time series, proposes a taxonomy for the reviewed methods, and provides a structured review highlighting their strengths and limitations.
Neural SDEs as Infinite-Dimensional GANs
TLDR
This work shows that the current classical approach to fitting SDEs may be approached as a special case of (Wasserstein) GANs, and in doing so the neural and classical regimes may be brought together.
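
A rough sketch of the generator side of this view: a neural SDE dY = mu(t, Y) dt + sigma(t, Y) dW is sampled with Euler-Maruyama, and the resulting paths would then be scored by a discriminator (omitted here). The placeholder networks, dimensions, and step count below are assumptions for illustration, not the paper's architecture or training objective.

import torch

mu    = torch.nn.Sequential(torch.nn.Linear(3, 32), torch.nn.Tanh(), torch.nn.Linear(32, 2))
sigma = torch.nn.Sequential(torch.nn.Linear(3, 32), torch.nn.Tanh(), torch.nn.Linear(32, 2))

def sample_paths(batch, n_steps=100, t1=1.0):
    # Euler-Maruyama simulation of dY = mu(t, Y) dt + sigma(t, Y) dW.
    dt = t1 / n_steps
    y = torch.randn(batch, 2)                    # initial condition (could itself be generated)
    path = [y]
    for k in range(n_steps):
        t = torch.full((batch, 1), k * dt)
        inp = torch.cat([t, y], dim=-1)
        dw = dt ** 0.5 * torch.randn(batch, 2)   # Brownian increment
        y = y + mu(inp) * dt + sigma(inp) * dw
        path.append(y)
    return torch.stack(path, dim=1)              # (batch, n_steps + 1, dim)

fake_paths = sample_paths(8)
print(fake_paths.shape)                          # these would be fed to a path-space discriminator
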
MIMO-GAN: Generative MIMO Channel Modeling
TLDR
This work uses advances in generative adversarial networks (GANs) to learn an implicit distribution over stochastic MIMO channels from observed measurements, implicitly modeling the wireless channel as a distribution over time-domain band-limited impulse responses.
...

References

SHOWING 1-10 OF 57 REFERENCES
Scalable Reversible Generative Models with Free-form Continuous Dynamics
TLDR
This paper uses Hutchinson’s trace estimator to give a scalable unbiased estimate of the log-density and demonstrates the approach on high-dimensional density estimation, image generation, and variational inference, achieving the state of the art among exact likelihood methods with efficient sampling.
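
The estimator referred to here is tr(A) ≈ E[ε^T A ε] for noise ε with zero mean and identity covariance; applied to the Jacobian of the dynamics, it gives an unbiased estimate of the divergence term in the continuous change-of-variables formula using only vector-Jacobian products. A minimal PyTorch sketch; hutchinson_divergence and the toy network are illustrative, not the paper's implementation.

import torch

def hutchinson_divergence(f, z, n_samples=1):
    # Unbiased estimate of tr(df/dz) = div f via Hutchinson's estimator
    # E[eps^T (df/dz) eps], here with Rademacher (+/-1) noise.
    z = z.requires_grad_(True)
    fz = f(z)
    est = 0.0
    for _ in range(n_samples):
        eps = torch.randint_like(z, 0, 2) * 2 - 1
        # One vector-Jacobian product per sample instead of d full Jacobian columns.
        vjp = torch.autograd.grad(fz, z, grad_outputs=eps, retain_graph=True)[0]
        est = est + (vjp * eps).sum(dim=-1)
    return est / n_samples

# In a continuous normalizing flow, log p(x) = log p(z_0) - integral of tr(df/dz) dt,
# and the integral is estimated along the ODE trajectory with terms like this one.
f = torch.nn.Sequential(torch.nn.Linear(3, 64), torch.nn.Tanh(), torch.nn.Linear(64, 3))
z = torch.randn(8, 3)
print(hutchinson_divergence(f, z))
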
Improved Variational Inference with Inverse Autoregressive Flow
TLDR
A new type of normalizing flow, inverse autoregressive flow (IAF), is proposed that, in contrast to earlier published flows, scales well to high-dimensional latent spaces and significantly improves upon diagonal Gaussian approximate posteriors.
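
The transform itself is compact: each output dimension is an affine function of the preceding input dimensions, so the Jacobian is triangular and the log-determinant is just a sum of log scales. The toy mu/sigma functions below stand in for the masked autoregressive network used in practice; they are illustrative, not the paper's parameterization.

import numpy as np

def mu(z_prev):    return 0.1 * z_prev.sum()
def sigma(z_prev): return np.exp(0.05 * z_prev.sum())     # positive by construction

def iaf_step(z):
    # z'_i = mu_i(z_{<i}) + sigma_i(z_{<i}) * z_i, so the Jacobian is lower
    # triangular and log|det| = sum_i log sigma_i(z_{<i}).
    out = np.empty_like(z)
    logdet = 0.0
    for i in range(len(z)):
        s = sigma(z[:i])
        out[i] = mu(z[:i]) + s * z[i]
        logdet += np.log(s)
    return out, logdet

z = np.random.default_rng(0).normal(size=5)
z_new, logdet = iaf_step(z)
# Density of the transformed sample: log q(z_new) = log q(z) - logdet.
print(z_new, logdet)
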
Intensity-Free Learning of Temporal Point Processes
TLDR
A simple mixture model is proposed that matches the flexibility of flow-based models, but also permits sampling and computing moments in closed form and is suitable for novel applications, such as learning sequence embeddings and imputing missing data.
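
One concrete instance consistent with this description is a log-normal mixture over inter-event times, which has closed-form density, sampling, and moments. In the paper the mixture parameters are produced by a history encoder; the fixed constants below are assumptions purely for illustration.

import numpy as np

w    = np.array([0.3, 0.7])      # mixture weights (sum to 1)
mean = np.array([-1.0, 0.5])     # means of log(tau) per component
std  = np.array([0.4, 0.8])      # standard deviations of log(tau)

def log_density(tau):
    # Log-normal mixture density over a positive inter-event time tau.
    comp = np.exp(-(np.log(tau) - mean) ** 2 / (2 * std ** 2)) / (tau * std * np.sqrt(2 * np.pi))
    return np.log((w * comp).sum())

def sample(rng):
    k = rng.choice(len(w), p=w)              # pick a component, then sample it exactly
    return np.exp(rng.normal(mean[k], std[k]))

def expected_inter_event_time():
    # Closed-form first moment: sum_k w_k * exp(mu_k + std_k^2 / 2).
    return (w * np.exp(mean + std ** 2 / 2)).sum()

rng = np.random.default_rng(0)
print(log_density(0.7), sample(rng), expected_inter_event_time())
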
Normalizing Flows for Probabilistic Modeling and Inference
TLDR
This review places special emphasis on the fundamental principles of flow design, and discusses foundational topics such as expressive power and computational trade-offs, and summarizes the use of flows for tasks such as generative modeling, approximate inference, and supervised learning.
Residual Flows for Invertible Generative Modeling
TLDR
The resulting approach, called Residual Flows, achieves state-of-the-art performance on density estimation amongst flow-based models and outperforms networks that use coupling blocks on joint generative and discriminative modeling.
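
The invertibility behind a residual block y = x + f(x) is easy to sketch: if f is a contraction (Lipschitz constant below 1), the inverse is the unique fixed point of x ← y - f(x). The sketch below, with a hand-picked contractive f, shows only that part; the unbiased power-series estimator of the log-determinant that gives Residual Flows its name is omitted.

import numpy as np

A = np.array([[0.3, -0.2], [0.1, 0.4]])   # spectral norm < 1, so f below is a contraction

def f(x):
    return np.tanh(A @ x)                  # tanh is 1-Lipschitz, so Lip(f) <= ||A|| < 1

def forward(x):
    return x + f(x)

def inverse(y, n_iter=50):
    # Banach fixed-point iteration for x = y - f(x); converges because f is contractive.
    x = y.copy()
    for _ in range(n_iter):
        x = y - f(x)
    return x

x = np.array([0.5, -1.0])
print(np.allclose(inverse(forward(x)), x))
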
VideoFlow: A Conditional Flow-Based Model for Stochastic Video Generation
TLDR
This work is the first to propose multi-frame video prediction with normalizing flows, which allows for direct optimization of the data likelihood, and produces high-quality stochastic predictions.
A Neural Stochastic Volatility Model
TLDR
This paper shows that the recent integration of statistical models with deep recurrent neural networks provides a new way of formulating volatility models that outperforms mainstream methods, e.g., deterministic models such as GARCH and its variants, and stochastic models, namely the MCMC-based model stochvol.
Auto-Encoding Variational Bayes
TLDR
A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case is introduced.
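
The reparameterization trick at the heart of this algorithm is compact enough to sketch: writing z = mu + sigma * eps with eps ~ N(0, I) makes a Monte Carlo estimate of the variational lower bound differentiable with respect to the encoder outputs. The linear encoder/decoder and Gaussian likelihood below are placeholder assumptions, not the paper's architecture.

import torch

enc = torch.nn.Linear(10, 2 * 3)     # outputs mean and log-variance of a 3-d latent
dec = torch.nn.Linear(3, 10)         # maps the latent back to data space

def elbo(x):
    mu, log_var = enc(x).chunk(2, dim=-1)
    eps = torch.randn_like(mu)
    z = mu + torch.exp(0.5 * log_var) * eps                   # reparameterized sample
    recon = -((dec(z) - x) ** 2).sum(dim=-1)                  # Gaussian log-likelihood up to a constant
    kl = 0.5 * (torch.exp(log_var) + mu ** 2 - 1 - log_var).sum(dim=-1)  # KL(q || N(0, I)) in closed form
    return (recon - kl).mean()

x = torch.randn(16, 10)
loss = -elbo(x)
loss.backward()                      # gradients flow through the sampling step
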
Variational Inference with Normalizing Flows
TLDR
It is demonstrated that the theoretical advantages of having posteriors that better match the true posterior, combined with the scalability of amortized variational approaches, provide a clear improvement in performance and applicability of variational inference.
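
A single planar-flow layer, the simplest member of the family proposed here, is f(z) = z + u * tanh(w^T z + b), with log|det df/dz| = log|1 + (u^T w)(1 - tanh^2(w^T z + b))|. The fixed parameters in this sketch are illustrative; in practice they are learned and constrained so the map stays invertible.

import numpy as np

u = np.array([0.5, -0.3])
w = np.array([1.0, 0.2])
b = 0.1

def planar(z):
    # One planar-flow step and its log-Jacobian-determinant.
    a = w @ z + b
    out = z + u * np.tanh(a)
    logdet = np.log(np.abs(1.0 + (u @ w) * (1.0 - np.tanh(a) ** 2)))
    return out, logdet

z = np.array([0.2, -1.5])
z_new, logdet = planar(z)
# Change of variables: log q_new(z_new) = log q(z) - logdet.
print(z_new, logdet)
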
Nonparametric Bayesian Learning of Switching Linear Dynamical Systems
TLDR
This work develops a sampling algorithm that combines a truncated approximation to the Dirichlet process with efficient joint sampling of the mode and state sequences, allowing for an unknown number of persistent, smooth dynamical modes.
...