Corpus ID: 239049743

Variational Predictive Routing with Nested Subjective Timescales

Authors: Alexey Zakharov, Qinghai Guo, Z. Fountas
Discovery and learning of an underlying spatiotemporal hierarchy in sequential data is an important topic for machine learning. Despite this, little work has been done to explore hierarchical generative models that can flexibly adapt their layerwise representations in response to datasets with different temporal dynamics. Here, we present Variational Predictive Routing (VPR) – a neural probabilistic inference system that organizes latent representations of video features in a temporal hierarchy… 
1 Citation
Bayesian sense of time in biological and artificial brains
Enquiries concerning the underlying mechanisms and the emergent properties of a biological brain have a long history of theoretical postulates and experimental findings. Today, the scientific…
References


Variational Temporal Abstraction
Variational Temporal Abstraction (VTA) is proposed: a hierarchical recurrent state-space model that infers latent temporal structure and thus performs stochastic state transitions hierarchically. It is applied to enable jumpy imagination in imagination-augmented agent learning, improving the efficiency of imagination.
VideoFlow: A Conditional Flow-Based Model for Stochastic Video Generation
This work is the first to propose multi-frame video prediction with normalizing flows, which allows for direct optimization of the data likelihood, and produces high-quality stochastic predictions.
Learning World Graphs to Accelerate Hierarchical Reinforcement Learning
A graph abstraction over the environment structure is proposed to accelerate the learning of hierarchical reinforcement-learning tasks; a thorough ablation study shows significant advantages in performance and efficiency over baselines that lack world-graph knowledge.
Hierarchical Multiscale Recurrent Neural Networks
A novel multiscale approach, the hierarchical multiscale recurrent neural network, is proposed; it captures latent hierarchical structure in a sequence by encoding temporal dependencies at different timescales using a novel update mechanism.
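The multiscale update idea above can be sketched as follows. This is a hedged toy illustration, not the paper's exact COPY/UPDATE/FLUSH rules: a lower level updates every step, and a scalar boundary detector gates when the upper level ticks. The dimensions, tanh cells, and hard threshold are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8  # hidden size (assumed for illustration)

# Random, untrained weights; a real model would learn these.
W_low = rng.normal(scale=0.1, size=(D, 2 * D))
W_high = rng.normal(scale=0.1, size=(D, 2 * D))
w_boundary = rng.normal(scale=0.1, size=D)

def step(h_low, h_high, x):
    # Lower level always updates, conditioned on the input and upper state.
    h_low = np.tanh(W_low @ np.concatenate([x, h_high]))
    # A learned (here: random) detector decides whether a segment boundary
    # occurred; only then does the upper, slower level update.
    boundary = (w_boundary @ h_low) > 0.0
    if boundary:
        h_high = np.tanh(W_high @ np.concatenate([h_low, h_high]))
    return h_low, h_high, boundary

h_low = np.zeros(D)
h_high = np.zeros(D)
updates = 0
for t in range(50):
    x = rng.normal(size=D)
    h_low, h_high, b = step(h_low, h_high, x)
    updates += int(b)
print(f"upper level updated on {updates}/50 steps")
```

Because the upper level skips most steps, it naturally encodes slower, coarser structure while the lower level tracks fast dynamics.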
An Architecture for Deep, Hierarchical Generative Models
We present an architecture which lets us train deep, directed generative models with many layers of latent variables. We include deterministic paths between all latent variables and the generated…
Clockwork Variational Autoencoders
This work introduces the Clockwork VAE (CW-VAE), a video prediction model that leverages a hierarchy of latent sequences in which higher levels tick at slower intervals. Experiments confirm that slower levels learn to represent objects that change more slowly in the video, while faster levels learn to represent faster-changing objects.
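The "clockwork" tick schedule described above can be sketched directly: level l of the hierarchy updates only every k**l steps, so higher levels evolve on exponentially slower timescales. The factor k and level count are assumed for illustration; the real model learns stochastic latent states, not these placeholder tick counters.

```python
k = 2           # assumed temporal abstraction factor between levels
num_levels = 3  # assumed depth of the latent hierarchy
T = 16          # sequence length

# Record the time steps at which each level would update.
tick_times = {l: [] for l in range(num_levels)}
for t in range(T):
    for l in range(num_levels):
        if t % (k ** l) == 0:  # level l ticks at 1/k**l of the base rate
            tick_times[l].append(t)

for l in range(num_levels):
    print(f"level {l} ticks at: {tick_times[l]}")
# level 0 ticks every step; level 2 ticks only at t = 0, 4, 8, 12.
```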
Deep Predictive Coding Networks for Video Prediction and Unsupervised Learning
The results suggest that prediction represents a powerful framework for unsupervised learning, allowing for implicit learning of object and scene structure.
Improved Conditional VRNNs for Video Prediction
This work proposes a hierarchy of latent variables that defines a family of flexible prior and posterior distributions to better model the probability of future sequences; the proposal is validated through a series of ablation experiments.
Adaptive Skip Intervals: Temporal Abstraction for Recurrent Dynamical Models
It is shown that, for some prediction tasks, allowing the model to choose its own prediction sampling rate improves both computational efficiency and prediction accuracy.
Learning and Querying Fast Generative Models for Reinforcement Learning
It is demonstrated that agents which query these models for decision making outperform strong model-free baselines on the game MSPACMAN, demonstrating the potential of using learned environment models for planning.