• Corpus ID: 209414683

Temporal Normalizing Flows

@article{Kusters2019TemporalNF,
  title={Temporal Normalizing Flows},
  author={Remy Kusters and Gert-Jan Both},
  journal={arXiv: Computational Physics},
  year={2019}
}
  • Published 19 December 2019
Analyzing and interpreting time-dependent stochastic data requires accurate and robust density estimation. In this paper we extend the concept of normalizing flows to so-called temporal normalizing flows (tNFs) to estimate time-dependent distributions, leveraging the full spatio-temporal information present in the dataset. Our approach is unsupervised, does not require an a priori characteristic scale, and can accurately estimate multi-scale distributions of vastly different length scales. We…
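As a rough sketch of the underlying idea (not the authors' implementation): a normalizing flow can be conditioned on time by letting the parameters of the invertible map depend on t, so that every time slice gets a properly normalized density via the change-of-variables formula. The drift and width schedules below are illustrative assumptions only.

```python
import numpy as np

def temporal_affine_flow_logpdf(x, t, mu_fn, log_sigma_fn):
    """Log-density of a toy 1-D time-conditioned affine flow.

    The flow maps a standard-normal latent z to x = mu(t) + exp(log_sigma(t)) * z,
    so the change-of-variables formula gives
        log p(x | t) = log N(z; 0, 1) - log_sigma(t),
    with z = (x - mu(t)) * exp(-log_sigma(t)).
    """
    log_sigma = log_sigma_fn(t)
    z = (x - mu_fn(t)) * np.exp(-log_sigma)
    return -0.5 * (z ** 2 + np.log(2.0 * np.pi)) - log_sigma

# Hypothetical schedules: a Gaussian pulse drifting at unit speed while
# spreading diffusively (variance 0.1 + 0.2 * t).
mu = lambda t: t
log_sigma = lambda t: 0.5 * np.log(0.1 + 0.2 * t)

xs = np.linspace(-3.0, 5.0, 401)
dx = xs[1] - xs[0]
p0 = np.exp(temporal_affine_flow_logpdf(xs, 0.0, mu, log_sigma))
p2 = np.exp(temporal_affine_flow_logpdf(xs, 2.0, mu, log_sigma))
# Each time slice integrates to ~1, and the mode drifts from x=0 to x=2.
print(float(np.sum(p0) * dx), float(xs[np.argmax(p2)]))
```

In the paper's setting the time-dependent parameters are learned from the spatio-temporal sample paths rather than fixed by hand, but the normalization-per-slice property illustrated here is the same.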


Citations

Learning the temporal evolution of multivariate densities via normalizing flows
This work proposes a method to learn multivariate probability distributions using sample path data from stochastic differential equations, and demonstrates that the approach can approximate the time evolution of probability density functions from observed sample data for systems driven by both Brownian and Lévy noise.

Solving time dependent Fokker-Planck equations via temporal normalizing flow
An adaptive learning approach based on temporal normalizing flows for solving time-dependent Fokker-Planck (TFP) equations is proposed; the method is mesh-free and can easily be applied to high-dimensional problems.

References

Showing 1–10 of 26 references
Masked Autoregressive Flow for Density Estimation
This work describes an approach for increasing the flexibility of an autoregressive model, based on modelling the random numbers that the model uses internally when generating data, which is called Masked Autoregressive Flow.
Continuous-Time Flows for Efficient Inference and Density Estimation
This paper proposes the concept of continuous-time flows (CTFs), a family of diffusion-based methods that are able to asymptotically approach a target distribution and demonstrates promising performance of the proposed CTF framework, compared to related techniques.
Inferring solutions of differential equations using noisy multi-fidelity data
Neural Autoregressive Flows
It is demonstrated that the proposed neural autoregressive flows (NAF) are universal approximators for continuous probability distributions, and their greater expressivity allows them to better capture multimodal target distributions.
A Review of Kernel Density Estimation with Applications to Econometrics
This comprehensive review summarizes the most important theoretical aspects of kernel density estimation and provides an extensive description of classical and modern data analytic methods to compute the smoothing parameter.
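To make the smoothing-parameter discussion concrete, here is a minimal Gaussian KDE sketch using the normal reference rule h = 1.06·σ·n^(−1/5), one of the classical bandwidth choices such a review covers. The sample data and grid are illustrative assumptions only.

```python
import numpy as np

def gaussian_kde(samples, grid):
    """Gaussian kernel density estimate with the normal-reference-rule bandwidth."""
    n = samples.size
    h = 1.06 * samples.std(ddof=1) * n ** (-0.2)  # h = 1.06 * sigma * n^(-1/5)
    # Average a Gaussian kernel of width h centred on every sample point.
    z = (grid[:, None] - samples[None, :]) / h
    return np.exp(-0.5 * z ** 2).sum(axis=1) / (n * h * np.sqrt(2.0 * np.pi))

rng = np.random.default_rng(0)
samples = rng.normal(0.0, 1.0, 2000)
grid = np.linspace(-4.0, 4.0, 201)
density = gaussian_kde(samples, grid)
print(float(grid[np.argmax(density)]))  # mode estimate, near the true mode at 0
```

The normal reference rule is optimal when the data really are Gaussian; for multimodal or multi-scale data it oversmooths, which is exactly the limitation the tNF abstract contrasts itself against.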
FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models
This paper uses Hutchinson's trace estimator to give a scalable unbiased estimate of the log-density and demonstrates the approach on high-dimensional density estimation, image generation, and variational inference, achieving the state-of-the-art among exact likelihood methods with efficient sampling.
Kernel density estimation via diffusion
Introduces a new adaptive kernel density estimator based on linear diffusion processes, which builds on existing ideas for adaptive smoothing by incorporating information from a pilot density estimate, together with a plug-in bandwidth selection method free of the arbitrary normal reference rules used by existing methods.
Nonparametric kernel density estimation near the boundary
Density estimation using Real NVP
This work extends the space of probabilistic models using real-valued non-volume preserving (real NVP) transformations, a set of powerful invertible and learnable transformations, resulting in an unsupervised learning algorithm with exact log-likelihood computation, exact sampling, exact inference of latent variables, and an interpretable latent space.
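The invertible transformations in Real NVP are affine coupling layers. A minimal NumPy sketch (with fixed toy functions standing in for the learned scale and shift networks) shows why both inversion and the log-determinant are exact and cheap:

```python
import numpy as np

def coupling_forward(x, scale_fn, shift_fn):
    """One affine coupling layer (the Real NVP building block) on a 2-D input.

    The first coordinate passes through unchanged; the second is scaled and
    shifted as a function of the first. The Jacobian is triangular, so
    log|det J| is just the log-scale, which makes the log-likelihood exact.
    """
    x1, x2 = x[..., 0], x[..., 1]
    s, t = scale_fn(x1), shift_fn(x1)
    return np.stack([x1, x2 * np.exp(s) + t], axis=-1), s  # s = log|det J|

def coupling_inverse(y, scale_fn, shift_fn):
    """Exact inverse of the coupling layer (no iterative solve needed)."""
    y1, y2 = y[..., 0], y[..., 1]
    s, t = scale_fn(y1), shift_fn(y1)
    return np.stack([y1, (y2 - t) * np.exp(-s)], axis=-1)

# In Real NVP these are learned neural networks; fixed toy functions here.
scale_fn = lambda u: 0.5 * np.tanh(u)
shift_fn = lambda u: u ** 2

x = np.array([0.3, -1.2])
y, log_det = coupling_forward(x, scale_fn, shift_fn)
x_back = coupling_inverse(y, scale_fn, shift_fn)
print(np.allclose(x, x_back))  # inversion is exact up to float rounding
```

Stacking several such layers while alternating which coordinate is transformed yields the expressive yet exactly invertible models the entry describes.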
A Disentangled Recognition and Nonlinear Dynamics Model for Unsupervised Learning
The Kalman variational auto-encoder is introduced, a framework for unsupervised learning of sequential data that disentangles two latent representations: an object's representation, coming from a recognition model, and a latent state describing its dynamics.