Corpus ID: 237364060

Bubblewrap: Online tiling and real-time flow prediction on neural manifolds

@article{Draelos2021BubblewrapOT,
  title={Bubblewrap: Online tiling and real-time flow prediction on neural manifolds},
  author={Anne W. Draelos and Pranjal Gupta and Na Young Jun and Chaichontat Sriworarat and John M. Pearson},
  journal={Advances in Neural Information Processing Systems},
  year={2021},
  volume={34},
  pages={6062--6074}
}
While most classic studies of function in experimental neuroscience have focused on the coding properties of individual neurons, recent developments in recording technologies have resulted in an increasing emphasis on the dynamics of neural populations. This has given rise to a wide variety of models for analyzing population activity in relation to experimental variables, but direct testing of many neural population hypotheses requires intervening in the system based on current neural state… 
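The title's two ingredients, online tiling of observed neural states and prediction of the flow between tiles, can be illustrated with a toy sketch. This is not the authors' algorithm (Bubblewrap adapts its tiles online and uses a probabilistic model); it is a minimal stand-in that assigns streaming 2D points to a fixed grid of tile centers and maintains transition counts for one-step-ahead prediction. All names and parameters here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed tile centers: a coarse 4x4 grid over [0, 1)^2. Bubblewrap itself
# adapts its tiling online; static tiles are used here only to show the
# tiling + transition-counting idea.
grid = np.linspace(0.125, 0.875, 4)
centers = np.array([(x, y) for x in grid for y in grid])  # 16 tiles
n_tiles = len(centers)

# Transition counts between tiles, updated one observation at a time
# (initialized at 1 for Laplace smoothing).
counts = np.ones((n_tiles, n_tiles))
prev_tile = None

def nearest_tile(point):
    """Assign a point to its nearest tile center."""
    return int(np.argmin(np.linalg.norm(centers - point, axis=1)))

# Simulate a noisy circular trajectory as a stand-in for low-dimensional
# neural population activity on a manifold.
for t in range(2000):
    theta = 0.05 * t
    point = 0.5 + 0.35 * np.array([np.cos(theta), np.sin(theta)])
    point += 0.02 * rng.standard_normal(2)
    tile = nearest_tile(point)
    if prev_tile is not None:
        counts[prev_tile, tile] += 1  # online update of the flow model
    prev_tile = tile

# Row-normalized counts give a transition matrix; its argmax row entry is
# the predicted next tile given the current one.
transition = counts / counts.sum(axis=1, keepdims=True)
predicted_next = int(np.argmax(transition[prev_tile]))
print(predicted_next)
```

Because both the tile assignment and the count update are constant-time per observation, a scheme like this can run in the streaming, closed-loop setting the abstract motivates.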

References

Showing 1-10 of 61 references

Linear dynamical neural population models through nonlinear embeddings

A body of recent work in modeling neural activity focuses on recovering low-dimensional latent features that capture the statistical structure of large-scale neural populations. Most such approaches…

Variational Online Learning of Neural Dynamics

A flexible online learning framework for latent nonlinear state dynamics and filtered latent states is developed using the stochastic gradient variational Bayes approach; it can incorporate nontrivial observation-noise distributions and has constant time and space complexity.

Inferring single-trial neural population dynamics using sequential auto-encoders

LFADS, a deep learning method for analyzing neural population activity, can extract neural dynamics from single-trial recordings, stitch separate datasets into a single model, and infer perturbations to these dynamics that correlate with, for example, behavioral choices.

Online Neural Connectivity Estimation with Noisy Group Testing

By stimulating small ensembles of neurons, it is shown that binarized network connectivity can be recovered with a number of tests that grows only logarithmically with population size under minimal statistical assumptions. The approach, which reduces to an efficiently solvable convex optimization problem, is proved equivalent to variational Bayesian inference on the binary connection weights.

A theory of multineuronal dimensionality, dynamics and measurement

A theory is presented that reveals conceptual insights into how task complexity governs both neural dimensionality and accurate recovery of dynamic portraits, thereby providing quantitative guidelines for future large-scale experimental design.

Accurate Estimation of Neural Population Dynamics without Spike Sorting

It is found that neural dynamics and scientific conclusions are quite similar using multi-unit threshold crossings in place of sorted neurons, which unlocks existing data for new analyses and informs the design of new electrode arrays for laboratory and clinical use.

Tree-Structured Recurrent Switching Linear Dynamical Systems for Multi-Scale Modeling

This work develops a class of models that aims to achieve interpretability and predictive accuracy simultaneously, smoothly interpolating between simple descriptions and more complex, yet more accurate, models; it outperforms existing methods in both interpretability and predictive capability.

Bayesian Learning and Inference in Recurrent Switching Linear Dynamical Systems

This work develops a model class and Bayesian inference algorithms that not only discover these dynamical units but also, by learning how transition probabilities depend on observations or continuous latent states, explain their switching behavior.

Training Excitatory-Inhibitory Recurrent Neural Networks for Cognitive Tasks: A Simple and Flexible Framework

A framework for gradient descent-based training of excitatory-inhibitory RNNs that can incorporate a variety of biological knowledge is described and an implementation based on the machine learning library Theano is provided, whose automatic differentiation capabilities facilitate modifications and extensions.

Dimensionality reduction for large-scale neural recordings

This review examines three important motivations for population studies: single-trial hypotheses requiring statistical power, hypotheses of population response structure, and exploratory analyses of large data sets; it also offers practical advice about selecting methods and interpreting their outputs.
...