Black box variational inference for state space models
TLDR
A structured Gaussian variational approximate posterior is proposed that carries the same intuition as the standard Kalman filter-smoother but allows the same inference approach to be applied to much more general, nonlinear latent-variable generative models.
Linear dynamical neural population models through nonlinear embeddings
A body of recent work in modeling neural activity focuses on recovering low-dimensional latent features that capture the statistical structure of large-scale neural populations. Most such approaches …
Bayesian entropy estimation for countable discrete distributions
TLDR
This work considers the problem of estimating Shannon's entropy H from discrete data, in cases where the number of possible symbols is unknown or even countably infinite, and derives a family of continuous measures for mixing Pitman-Yor processes to produce an approximately flat prior over H.
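To make concrete why estimating entropy from under-sampled discrete data is hard, here is a minimal baseline sketch: the plug-in (maximum-likelihood) estimator and its Miller-Madow bias correction. This is *not* the Pitman-Yor mixture estimator the paper develops; it is the simple baseline such Bayesian estimators are designed to improve upon, since the plug-in estimate is systematically biased downward when many symbols are unobserved.

```python
import math
from collections import Counter

def entropy_mle(samples):
    """Plug-in (maximum-likelihood) estimate of Shannon entropy, in nats."""
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def entropy_miller_madow(samples):
    """Plug-in estimate plus the Miller-Madow bias correction (K - 1) / (2N),
    where K is the number of distinct symbols actually observed."""
    n = len(samples)
    k = len(set(samples))
    return entropy_mle(samples) + (k - 1) / (2 * n)
```

For a fair coin observed many times the plug-in estimate recovers log 2 nats, but in the under-sampled regime (few samples, many possible symbols) both estimators remain biased, which is the regime the Bayesian approaches above target.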
Spectral methods for neural characterization using generalized quadratic models
TLDR
The generalized quadratic model provides a natural framework for combining multi-dimensional stimulus sensitivity and spike-history dependencies within a single model and provides closed-form estimators under a large class of non-Gaussian stimulus distributions.
Bayesian and Quasi-Bayesian Estimators for Mutual Information from Discrete Data
TLDR
This work discusses several regularized estimators for MI that employ priors based on the Dirichlet distribution, examines the performance of these estimators on a variety of simulated datasets, and shows that, surprisingly, quasi-Bayesian estimators generally outperform the proposed Bayesian estimator.
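For reference, the naive plug-in MI estimator that such regularized estimators are compared against can be sketched in a few lines. This is only the empirical-frequency baseline, not the Dirichlet-prior estimators the paper studies:

```python
import math
from collections import Counter

def mutual_information_plugin(pairs):
    """Plug-in estimate of I(X;Y) in nats from a list of (x, y) samples,
    using empirical joint and marginal frequencies."""
    n = len(pairs)
    joint = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    mi = 0.0
    for (x, y), c in joint.items():
        # p_xy / (p_x * p_y) simplifies to c * n / (px[x] * py[y])
        mi += (c / n) * math.log(c * n / (px[x] * py[y]))
    return mi
```

Like the plug-in entropy estimate, this is upward-biased for small samples, which motivates the Dirichlet-regularized alternatives.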
Low-dimensional models of neural population activity in sensory cortical circuits
TLDR
A statistical model of neural population activity that integrates a nonlinear receptive field model with a latent dynamical model of ongoing cortical activity that captures temporal dynamics and correlations due to shared stimulus drive as well as common noise is introduced.
Fast amortized inference of neural activity from calcium imaging data with variational autoencoders
TLDR
The generality of the method is demonstrated by proposing the first probabilistic approach for separating backpropagating action potentials from putative synaptic inputs in calcium imaging of dendritic spines, and it is shown that amortization can be applied flexibly to a wide range of nonlinear generative models.
Bayesian entropy estimation for binary spike train data using parametric prior knowledge
TLDR
Bayesian estimators for the entropy of binary spike trains are developed using priors designed to flexibly exploit the statistical structure of simultaneously recorded spike responses, and a compact representation of the data and prior is devised that allows computationally efficient implementations of Bayesian least-squares and empirical Bayes entropy estimators with large numbers of neurons.
Universal models for binary spike patterns using centered Dirichlet processes
TLDR
This work proposes a family of "universal" models for binary spike patterns, where universality refers to the ability to model arbitrary distributions over all 2^m binary patterns, and constructs universal models using a Dirichlet process centered on a well-behaved parametric base measure.
Bayesian estimation of discrete entropy with mixtures of stick-breaking priors
TLDR
A family of continuous mixing measures is defined such that the resulting mixture of Dirichlet or Pitman-Yor processes produces an approximately flat prior over H; this matters because in the under-sampled regime the prior strongly determines the estimate.