The information bottleneck method
TLDR
The variational principle provides a surprisingly rich framework for discussing a variety of problems in signal processing and learning.
Spikes: Exploring the Neural Code
TLDR
Spikes provides a self-contained review of the concepts from information theory and statistical decision theory needed to describe the representation of sensory signals in neural spike trains, and uses this quantitative framework to pose precise questions about the structure of the neural code.
Weak pairwise correlations imply strongly correlated network states in a neural population
TLDR
It is shown, in the vertebrate retina, that weak correlations between pairs of neurons coexist with strongly collective behaviour in the responses of ten or more neurons, and it is found that this collective behaviour is described quantitatively by models that capture the observed pairwise correlations but assume no higher-order interactions.
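The pairwise models referred to above are maximum-entropy (Ising-like) distributions over binary spike words. A toy sketch, not the paper's fitted model: for a small population we can enumerate all states and compute the distribution directly; the fields `h` and couplings `J` below are illustrative values, not data.

```python
# Toy pairwise maximum-entropy (Ising-like) model for a small binary
# neural population: P(s) ~ exp(sum_i h_i s_i + sum_{i<j} J_ij s_i s_j).
# h and J are illustrative, not fitted to retinal data.
import itertools
import math

def ising_distribution(h, J):
    """Return P(state) over all 2^n binary (0/1) spike words."""
    n = len(h)
    states = list(itertools.product([0, 1], repeat=n))
    weights = []
    for s in states:
        e = sum(h[i] * s[i] for i in range(n))
        e += sum(J[i][j] * s[i] * s[j]
                 for i in range(n) for j in range(i + 1, n))
        weights.append(math.exp(e))
    Z = sum(weights)  # partition function
    return {s: w / Z for s, w in zip(states, weights)}

# Sparse firing (negative fields), weak positive pairwise couplings.
h = [-2.0, -2.0, -2.0]
J = [[0.0, 0.5, 0.5],
     [0.0, 0.0, 0.5],
     [0.0, 0.0, 0.0]]
P = ising_distribution(h, J)
```

With negative fields the silent state is the most probable word, yet the weak positive couplings already make joint firing more likely than independence would predict, which is the qualitative point of the pairwise description.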
Entropy and Information in Neural Spike Trains
TLDR
It is shown how to quantify the information carried by neural spike trains, in bits, free from any assumptions about which features of the spike train or input signal are most important, and this approach is applied to the analysis of experiments on a motion-sensitive neuron in the fly visual system.
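The core of such "direct" entropy estimates is to discretize the spike train into binary words and compute the entropy of the observed word distribution. A simplified sketch, which omits the finite-sample bias corrections a real analysis needs:

```python
# Plug-in estimate of spike-train word entropy in bits.
# Simplified illustration: real analyses must correct for the bias
# introduced by estimating probabilities from finite data.
from collections import Counter
import math

def word_entropy_bits(spikes, word_len):
    """Entropy (bits) of overlapping binary words of length word_len."""
    words = [tuple(spikes[i:i + word_len])
             for i in range(len(spikes) - word_len + 1)]
    counts = Counter(words)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values())

# An equiprobable 0/1 train carries exactly 1 bit per letter.
train = [0, 1] * 8
h1 = word_entropy_bits(train, 1)
```

Comparing such word-entropy estimates for the full response distribution and for responses conditioned on the stimulus yields the mutual information in bits, without assuming which spike-train features matter.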
Stability and Nuclear Dynamics of the Bicoid Morphogen Gradient
TLDR
Both direct photobleaching measurements and indirect estimates of Bicoid-eGFP diffusion constants provide a consistent picture of Bicoid transport on short time scales, but challenge traditional models of long-range gradient formation.
Probing the Limits to Positional Information
TLDR
This agreement among different measures of accuracy indicates that the Drosophila embryo is not limited by noisy input signals and readout mechanisms; rather, the system exerts precise control over absolute concentrations and responds reliably to small concentration differences, approaching the limits set by basic physical principles.
Statistics of Natural Images: Scaling in the Woods
TLDR
This work gathers images from the woods and finds that these scenes possess an ensemble scale invariance and a non-Gaussian character that cannot be removed through local linear filtering, with consequences for codes that maximize information at fixed channel variance.
Efficiency and ambiguity in an adaptive neural code
TLDR
The dynamics of a neural code are examined in the context of stimuli whose statistical properties are themselves evolving dynamically; the code resolves potential ambiguities and approaches the physical limit imposed by statistical sampling and noise.
Adaptive Rescaling Maximizes Information Transmission
TLDR
This work relates an adaptive property of a sensory system directly to its function as a carrier of information about input signals, and gives direct evidence that the scaling of the input/output relation is set to maximize information transmission for each distribution of signals.
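A standard way to see why such rescaling maximizes information is the matching principle: for a monotone input/output relation with a fixed output range, information is maximized when the nonlinearity equals the cumulative distribution of the stimulus, so that outputs are used uniformly. A hedged sketch with illustrative Gaussian stimuli, not the fly data analyzed in the paper:

```python
# Matching principle behind adaptive rescaling: the information-maximizing
# monotone nonlinearity (fixed output range, negligible noise) is the
# cumulative distribution of the stimulus, making the output uniform.
# The Gaussian stimulus here is illustrative.
import random
import bisect

def optimal_nonlinearity(samples):
    """Return f(x) = empirical CDF of the stimulus samples."""
    xs = sorted(samples)
    def f(x):
        return bisect.bisect_right(xs, x) / len(xs)
    return f

random.seed(0)
stimulus = [random.gauss(0.0, 1.0) for _ in range(10000)]
f = optimal_nonlinearity(stimulus)
outputs = [f(x) for x in stimulus]  # approximately uniform on [0, 1]
```

If the stimulus distribution is rescaled (its standard deviation changes), the optimal nonlinearity rescales with it, which is the sense in which the input/output relation adapts to each distribution of signals.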
Analyzing Neural Responses to Natural Signals: Maximally Informative Dimensions
TLDR
A method is proposed that allows for a rigorous statistical analysis of neural responses to natural stimuli that are non-Gaussian and exhibit strong correlations; the method maximizes the mutual information between the neural responses and projections of the stimulus onto low-dimensional subspaces.