Publications
Neural scene representation and rendering
TLDR
The Generative Query Network (GQN), a framework in which machines learn to represent scenes using only their own sensors, is introduced, demonstrating representation learning without human labels or domain knowledge.
Imagination-Augmented Agents for Deep Reinforcement Learning
TLDR
Imagination-Augmented Agents (I2As), a novel architecture for deep reinforcement learning that combines model-free and model-based aspects, show improved data efficiency, performance, and robustness to model misspecification compared to several baselines.
Bayesian Computation Emerges in Generic Cortical Microcircuits through Spike-Timing-Dependent Plasticity
TLDR
The results suggest that the experimentally observed spontaneous activity and trial-to-trial variability of cortical neurons are essential features of their information processing capability, since their functional role is to represent probability distributions rather than static neural codes.
BLACK BOX VARIATIONAL INFERENCE FOR STATE SPACE MODELS
TLDR
A structured Gaussian variational approximate posterior is proposed that carries the same intuition as the standard Kalman filter-smoother but permits us to use the same inference approach to approximate the posterior of much more general, nonlinear latent variable generative models.
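The Kalman filter-smoother intuition mentioned in this TLDR can be illustrated in a scalar-state sketch: a Gaussian over a whole trajectory whose joint precision matrix is tridiagonal factorizes into Markov conditionals, so sampling runs forward in time just like a smoother draw. This is a minimal sketch, not the paper's implementation; the function name and parameters (`m`, `A`, `s`) are placeholders.

```python
import random

def sample_markov_gaussian(m, A, s, seed=0):
    """Draw one path from a structured Gaussian q(x_{1:T}) that factorizes
    like a Kalman smoother: x_1 ~ N(m[0], s[0]^2) and, for t >= 1,
    x_t ~ N(m[t] + A[t] * x_{t-1}, s[t]^2).

    The joint precision matrix of this family is tridiagonal, which is
    the structure such variational posteriors exploit.
    (Scalar-state illustrative sketch; names are placeholders.)
    """
    rng = random.Random(seed)
    path = [rng.gauss(m[0], s[0])]          # initial state
    for t in range(1, len(m)):
        # each conditional depends only on the previous state (Markov)
        path.append(rng.gauss(m[t] + A[t] * path[-1], s[t]))
    return path
```

Setting all standard deviations to zero makes the draw deterministic, which makes the Markov recursion easy to inspect by hand.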
Neural Dynamics as Sampling: A Model for Stochastic Computation in Recurrent Networks of Spiking Neurons
TLDR
A neural network model is proposed and it is shown by a rigorous theoretical analysis that its neural activity implements MCMC sampling of a given distribution, both for the case of discrete and continuous time.
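The idea that neural activity can implement MCMC sampling can be sketched in discrete time with a binary network performing Gibbs sampling of a Boltzmann distribution: each "neuron" switches on with a sigmoidal probability of its input, and the long-run state statistics match the target distribution. This is an illustrative sketch under those assumptions, not the paper's continuous-time spiking model.

```python
import math
import random

def neural_gibbs_sampler(W, b, n_steps, seed=0):
    """Gibbs sampling over binary neuron states z in {0,1}^n.

    At each step one neuron k is picked at random and fires (z_k = 1)
    with probability sigmoid(u_k), where u_k = b[k] + sum_j W[k][j]*z[j]
    is its current input. For symmetric W with zero diagonal, the
    stationary distribution of this chain is the Boltzmann distribution
    p(z) proportional to exp(b.z + 0.5 * z'Wz).
    (Discrete-time illustrative sketch; names are placeholders.)
    """
    rng = random.Random(seed)
    n = len(b)
    z = [0] * n
    samples = []
    for _ in range(n_steps):
        k = rng.randrange(n)  # asynchronous update of one random neuron
        u = b[k] + sum(W[k][j] * z[j] for j in range(n) if j != k)
        z[k] = 1 if rng.random() < 1.0 / (1.0 + math.exp(-u)) else 0
        samples.append(tuple(z))
    return samples
```

With no recurrent weights, each neuron's empirical firing probability should approach the sigmoid of its bias, which gives a simple sanity check on the sampler.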
Empirical models of spiking in neural populations
TLDR
This work argues that in the cortex, where firing exhibits extensive correlations in both time and space and where a typical sample of neurons still reflects only a very small fraction of the local population, the most appropriate model captures shared variability by a low-dimensional latent process evolving with smooth dynamics, rather than by putative direct coupling.
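The model class this TLDR favors can be sketched generatively: a single smoothly evolving latent state drives every neuron's firing rate, so shared variability across the population arises from the latent process rather than from direct neuron-to-neuron coupling. This is a minimal illustrative sketch; the loadings, baselines, and AR(1) latent are made-up stand-ins, not the paper's fitted model.

```python
import math
import random

def sample_latent_population(n_neurons, n_bins, rho=0.95, seed=0):
    """Spike counts driven by one shared, smoothly evolving latent state.

    A scalar AR(1) process x_t = rho*x_{t-1} + sqrt(1-rho^2)*eps_t
    evolves with smooth dynamics; neuron i then emits a Poisson count
    with rate exp(c_i * x_t + d), so correlations across neurons and
    time bins come from the shared latent variable alone.
    (Illustrative sketch; loadings and baselines are invented.)
    """
    rng = random.Random(seed)
    c = [rng.uniform(0.1, 0.5) for _ in range(n_neurons)]  # latent loadings
    d = math.log(2.0)                                      # baseline log-rate
    x, counts = 0.0, []
    for _ in range(n_bins):
        x = rho * x + math.sqrt(1.0 - rho ** 2) * rng.gauss(0.0, 1.0)
        row = []
        for ci in c:
            lam = math.exp(ci * x + d)
            # Knuth's Poisson sampler (adequate for small rates)
            L, k, p = math.exp(-lam), 0, 1.0
            while True:
                p *= rng.random()
                if p <= L:
                    break
                k += 1
            row.append(k)
        counts.append(row)
    return counts
```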
Connectivity, Dynamics, and Memory in Reservoir Computing with Binary and Analog Neurons
TLDR
Investigating the influence of network connectivity (parameterized by the neuron in-degree) on a family of network models that interpolates between analog and binary networks reveals that the phase transition between ordered and chaotic behavior in binary circuits qualitatively differs from that in analog circuits, which explains the decreased computational performance observed in densely connected binary circuits.
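The in-degree parameter central to this TLDR can be made concrete with a tiny reservoir in which each unit receives a fixed number of recurrent connections and is either an analog (tanh) or a binary (threshold) neuron. This is a hedged sketch under assumed weight scales and without a trained readout; all names and parameters are illustrative.

```python
import math
import random

def run_reservoir(inputs, n_units=20, in_degree=3, binary=False, seed=0):
    """Drive a sparsely connected recurrent reservoir with a scalar input.

    Each unit receives exactly `in_degree` recurrent connections (the
    parameter varied between sparse and dense regimes) plus the input.
    `binary=True` uses a threshold unit, otherwise tanh. Returns the
    state trajectory, one state vector per input step.
    (Minimal sketch; weight scales are placeholder choices.)
    """
    rng = random.Random(seed)
    # each unit i gets in_degree (source, weight) recurrent connections
    conns = [[(rng.randrange(n_units),
               rng.gauss(0.0, 1.0 / math.sqrt(in_degree)))
              for _ in range(in_degree)] for _ in range(n_units)]
    w_in = [rng.gauss(0.0, 1.0) for _ in range(n_units)]  # input weights
    x = [0.0] * n_units
    traj = []
    for u in inputs:
        pre = [w_in[i] * u + sum(w * x[s] for s, w in conns[i])
               for i in range(n_units)]
        x = [(1.0 if a > 0.0 else 0.0) if binary else math.tanh(a)
             for a in pre]
        traj.append(list(x))
    return traj
```

Comparing the two activation modes at increasing `in_degree` is the kind of experiment the summary alludes to.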
Tag-Trigger-Consolidation: A Model of Early and Late Long-Term-Potentiation and Depression
TLDR
A mathematical model is presented that describes the induction of long-term potentiation and depression (LTP/LTD) during the early phase of synaptic plasticity, the setting of synaptic tags, a trigger process for protein synthesis, and a slow transition leading to synaptic consolidation during the late phase of synaptic plasticity.
Spike-Frequency Adapting Neural Ensembles: Beyond Mean Adaptation and Renewal Theories
TLDR
It is shown that the full five-dimensional master equation for a conductance-based integrate-and-fire neuron with spike-frequency adaptation and a relative refractory mechanism driven by Poisson spike trains can be reduced to a two-dimensional generalization of the proposed Markov process by an adiabatic elimination of fast variables.
Learning model-based planning from scratch
TLDR
The "Imagination-based Planner" is introduced, the first model-based, sequential decision-making agent that can learn to construct, evaluate, and execute plans, and also learn elaborate planning strategies in a discrete maze-solving task.