Publications
Neural scene representation and rendering
TLDR: To train a computer to “recognize” elements of a scene supplied by its visual sensors, computer scientists typically use millions of images painstakingly labeled by humans.
Bayesian Computation Emerges in Generic Cortical Microcircuits through Spike-Timing-Dependent Plasticity
TLDR: The principles by which networks of neurons compute, and how spike-timing-dependent plasticity of synaptic weights generates and maintains their computational function, are unknown.
Imagination-Augmented Agents for Deep Reinforcement Learning
TLDR: We introduce Imagination-Augmented Agents, a novel architecture for deep reinforcement learning combining model-free and model-based aspects.
Neural Dynamics as Sampling: A Model for Stochastic Computation in Recurrent Networks of Spiking Neurons
TLDR: We propose a neural network model and show that, under some conditions, the stochastic firing activity of networks of spiking neurons can be interpreted as probabilistic inference via Markov chain Monte Carlo (MCMC) sampling.
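A generic illustration of the sampling view (not the paper's spiking network model, and with arbitrary weights): binary stochastic units whose firing probability is a sigmoid of their summed input perform Gibbs sampling from a Boltzmann distribution, so their long-run activity statistics approximate the target distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative symmetric weights and biases defining a Boltzmann distribution
# p(z) ∝ exp(0.5 * z^T W z + b^T z) over binary states z in {0,1}^n.
n = 5
W = rng.normal(0.0, 0.5, (n, n))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)
b = rng.normal(0.0, 0.5, n)

def gibbs_chain(steps=50_000):
    """Gibbs sampling: each 'neuron' fires with sigmoidal probability given
    the current state of the others, mimicking stochastic spiking activity."""
    z = rng.integers(0, 2, n)
    samples = np.empty((steps, n))
    for t in range(steps):
        for i in range(n):
            u = W[i] @ z + b[i]                      # local "membrane potential"
            z[i] = rng.random() < 1.0 / (1.0 + np.exp(-u))
        samples[t] = z
    return samples

# Long-run firing rates approximate the marginals of the target distribution.
print("empirical firing rates:", gibbs_chain().mean(axis=0).round(3))
```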
Black Box Variational Inference for State Space Models
Latent variable time-series models are among the most heavily used tools from machine learning and applied statistics. These models have the advantage of learning latent structure both from noisy …
Empirical models of spiking in neural populations
TLDR: We argue that in the cortex, where firing exhibits extensive correlations in both time and space and where a typical sample of neurons still reflects only a very small fraction of the local population, the most appropriate model captures shared variability by a low-dimensional latent process evolving with smooth dynamics, rather than by putative direct coupling.
Tag-Trigger-Consolidation: A Model of Early and Late Long-Term-Potentiation and Depression
TLDR: We present a mathematical model that describes the early and late phases of long-term potentiation and depression.
Connectivity, Dynamics, and Memory in Reservoir Computing with Binary and Analog Neurons
TLDR: We address the apparent dichotomy between binary and analog reservoirs by investigating the influence of the network connectivity (parameterized by the neuron in-degree) on a family of network models that interpolates between analog and binary networks.
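For reference, a minimal echo state network sketch in the same spirit: an analog (tanh) reservoir whose recurrent connectivity is fixed by a per-neuron in-degree, with a linear readout trained by ridge regression on a toy delayed-recall task. The network size, task, and spectral-radius scaling are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_reservoir(n=200, in_degree=10, spectral_radius=0.9):
    """Sparse recurrent weight matrix where every unit receives exactly
    `in_degree` incoming connections (the connectivity parameter of interest)."""
    W = np.zeros((n, n))
    for i in range(n):
        pre = rng.choice(n, size=in_degree, replace=False)
        W[i, pre] = rng.normal(0.0, 1.0, in_degree)
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
    return W

def run_esn(u, W, w_in):
    """Drive the analog (tanh) reservoir with input sequence u and collect states."""
    x = np.zeros(W.shape[0])
    states = []
    for u_t in u:
        x = np.tanh(W @ x + w_in * u_t)
        states.append(x.copy())
    return np.array(states)

# Toy memory task: reconstruct the input delayed by 5 steps.
T, delay = 2000, 5
u = rng.uniform(-1, 1, T)
W = make_reservoir()
w_in = rng.normal(0.0, 1.0, W.shape[0])
X = run_esn(u, W, w_in)[delay:]
y = u[:-delay]

# Ridge-regression readout.
lam = 1e-4
w_out = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
print("training MSE:", np.mean((X @ w_out - y) ** 2))
```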
Spike-Frequency Adapting Neural Ensembles: Beyond Mean Adaptation and Renewal Theories
TLDR: We propose a Markov process model for spike-frequency adapting neural ensembles that synthesizes existing mean-adaptation approaches, population density methods, and inhomogeneous renewal theory by accounting for correlations between subsequent interspike intervals.
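To make spike-frequency adaptation concrete, here is a generic adaptive leaky integrate-and-fire neuron (not the paper's ensemble model): an adaptation current grows with every spike and decays slowly, so interspike intervals lengthen and successive intervals become correlated, which is the dependence the proposed Markov process model accounts for. All parameters are illustrative.

```python
import numpy as np

# Illustrative parameters for an adaptive leaky integrate-and-fire neuron.
dt = 0.1          # time step (ms)
tau_m = 20.0      # membrane time constant (ms)
tau_a = 200.0     # adaptation time constant (ms)
v_th, v_reset = 1.0, 0.0
delta_a = 0.15    # adaptation increment per spike
I = 1.3           # constant suprathreshold drive

v, a = 0.0, 0.0
spike_times = []
for step in range(int(2000 / dt)):
    # The adaptation variable a subtracts from the drive and decays exponentially.
    v += dt / tau_m * (-v + I - a)
    a += dt / tau_a * (-a)
    if v >= v_th:
        v = v_reset
        a += delta_a
        spike_times.append(step * dt)

isis = np.diff(spike_times)
print("first ISIs (ms):", np.round(isis[:5], 1))  # intervals lengthen as adaptation builds up
```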
High-dimensional neural spike train analysis with generalized count linear dynamical systems
TLDR: We develop the generalized count linear dynamical system, which relaxes the Poisson assumption by using a more general exponential family for count data, and demonstrate performance improvements over state-of-the-art methods, both in capturing the variance structure of the data and in held-out prediction.
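For context, the baseline this model generalizes is the Poisson linear dynamical system: a low-dimensional linear-Gaussian latent trajectory drives spike counts through an exponential link. Below is a minimal generative sketch with illustrative dimensions and parameters; the generalized count LDS replaces the Poisson emission with a more flexible exponential-family count distribution.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative dimensions: latent state size, number of recorded neurons, time bins.
d_latent, n_neurons, T = 2, 30, 200

A = 0.99 * np.eye(d_latent)                    # latent dynamics (slow, smooth)
Q = 0.01 * np.eye(d_latent)                    # process noise covariance
C = rng.normal(0.0, 1.0, (n_neurons, d_latent))  # loading matrix
b = np.full(n_neurons, -1.5)                   # baseline log-rates

# Generate a latent trajectory and Poisson counts y_t ~ Poisson(exp(C x_t + b)).
x = np.zeros(d_latent)
counts = np.empty((T, n_neurons), dtype=int)
for t in range(T):
    x = A @ x + rng.multivariate_normal(np.zeros(d_latent), Q)
    counts[t] = rng.poisson(np.exp(C @ x + b))

# The latent trajectory induces shared variability across neurons; with Poisson
# emissions the counts are conditionally equi-dispersed (variance = mean given x_t),
# which is exactly the assumption the generalized count model relaxes.
print("mean count:", counts.mean().round(3),
      "marginal variance/mean:", (counts.var() / counts.mean()).round(3))
```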