Publications
ARACNE: An Algorithm for the Reconstruction of Gene Regulatory Networks in a Mammalian Cellular Context
TLDR: We present ARACNE, a novel algorithm, using microarray expression profiles, specifically designed to scale up to the complexity of regulatory networks in mammalian cells, yet general enough to address a wider range of network deconvolution problems.
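The core idea described in this TLDR is a two-step scheme: estimate pairwise mutual information between expression profiles, then prune likely indirect interactions using the data processing inequality. Below is a minimal sketch of that scheme; the histogram MI estimator, threshold values, and toy data are illustrative assumptions, not the paper's exact kernel estimator or significance procedure.

```python
import numpy as np
from itertools import combinations

def mutual_information(x, y, bins=8):
    """Plug-in MI estimate (nats) from a 2D histogram (illustrative only)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def aracne_like_network(expr, mi_threshold=0.05, dpi_tolerance=0.15):
    """expr: (n_genes, n_samples). Returns undirected edges surviving an MI
    threshold followed by data-processing-inequality (DPI) pruning."""
    n = expr.shape[0]
    mi = np.zeros((n, n))
    for i, j in combinations(range(n), 2):
        mi[i, j] = mi[j, i] = mutual_information(expr[i], expr[j])
    edges = {(i, j) for i, j in combinations(range(n), 2) if mi[i, j] > mi_threshold}
    # DPI: in every fully connected triangle, drop the weakest edge
    # unless it lies within the tolerance of the next-weakest one.
    for i, j, k in combinations(range(n), 3):
        triple = [(mi[i, j], (i, j)), (mi[i, k], (i, k)), (mi[j, k], (j, k))]
        weakest_val, weakest_edge = min(triple)
        if all(e in edges for _, e in triple):
            if weakest_val < (1 - dpi_tolerance) * sorted(v for v, _ in triple)[1]:
                edges.discard(weakest_edge)
    return edges

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy chain gene0 -> gene1 -> gene2, so (0, 2) is an indirect interaction.
    g0 = rng.normal(size=2000)
    g1 = g0 + 0.3 * rng.normal(size=2000)
    g2 = g1 + 0.3 * rng.normal(size=2000)
    print(aracne_like_network(np.vstack([g0, g1, g2])))
```

On the toy chain, the direct edges (0, 1) and (1, 2) survive while the indirect edge (0, 2) is removed by the DPI step, which is the behavior the algorithm relies on to separate direct from indirect regulatory interactions.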
Entropy and Inference, Revisited
TLDR: We study properties of popular near-uniform (Dirichlet) priors for learning undersampled probability distributions on discrete nonmetric spaces and show that they lead to disastrous results.
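To see the kind of pathology this TLDR refers to, the toy sketch below estimates the entropy of a badly undersampled distribution using symmetric Dirichlet (pseudocount) smoothing for a few values of the prior parameter beta: the answer is dominated by the prior rather than the data. The cardinality, sample size, and estimator here are illustrative assumptions, not the paper's analysis or its NSB remedy.

```python
import numpy as np

def smoothed_entropy(counts, beta, K):
    """Entropy (bits) of the posterior-mean distribution under a symmetric
    Dirichlet(beta) prior over K outcomes; counts holds observed bins only."""
    n = counts.sum()
    p_seen = (counts + beta) / (n + beta * K)
    p_unseen = beta / (n + beta * K)              # shared by all unobserved bins
    h = -(p_seen * np.log2(p_seen)).sum()
    if beta > 0:
        h += -(K - len(counts)) * p_unseen * np.log2(p_unseen)
    return h

rng = np.random.default_rng(1)
K = 10_000                                        # cardinality of the space
true_p = rng.dirichlet(np.full(K, 0.02))
true_H = -(true_p * np.log2(true_p)).sum()

sample = rng.choice(K, size=100, p=true_p)        # badly undersampled: N << K
counts = np.bincount(sample)
counts = counts[counts > 0]

print(f"true entropy ~ {true_H:.2f} bits")
for beta in (0.0, 0.5, 1.0):
    print(f"beta = {beta}: estimate = {smoothed_entropy(counts, beta, K):.2f} bits")
```

With only 100 samples over 10,000 bins, beta = 0 (the plug-in estimate) badly underestimates the entropy, while beta near 1 pins the estimate close to log2(K) regardless of the data: the estimate reflects the prior, not the distribution.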
Information Transduction Capacity of Noisy Biochemical Signaling Networks
Noise limits information transfer through a single signaling pathway in a single cell to just one bit. Molecular noise restricts the ability of an individual cell to resolve input signals of …
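The "one bit" figure is a channel capacity: the maximum mutual information between inputs and noisy outputs of a signaling pathway. As a rough illustration of how such a number can be computed for a discretized channel, here is the standard Blahut-Arimoto algorithm applied to a toy channel with Gaussian readout noise; the noise model and parameters are assumptions for the example and are not taken from the paper.

```python
import numpy as np

def blahut_arimoto(p_y_given_x, iters=500):
    """Capacity (bits) of a discrete memoryless channel.
    p_y_given_x[x, y] = P(Y = y | X = x)."""
    n_x = p_y_given_x.shape[0]
    p_x = np.full(n_x, 1.0 / n_x)

    def kl_per_input(p_x):
        q_y = p_x @ p_y_given_x                   # induced output distribution
        log_ratio = np.log2(p_y_given_x / q_y,
                            where=p_y_given_x > 0,
                            out=np.zeros_like(p_y_given_x))
        return np.einsum('xy,xy->x', p_y_given_x, log_ratio)

    for _ in range(iters):
        d = kl_per_input(p_x)
        p_x = p_x * 2.0 ** d                      # multiplicative update
        p_x /= p_x.sum()
    return float(p_x @ kl_per_input(p_x))

# Toy channel: 8 input (ligand) levels, noisy readout on a 64-bin output axis.
levels = np.linspace(0.0, 1.0, 8)
outputs = np.linspace(-0.5, 1.5, 64)
sigma = 0.35                                      # noise comparable to level spacing
p_y_given_x = np.exp(-(outputs[None, :] - levels[:, None]) ** 2 / (2 * sigma ** 2))
p_y_given_x /= p_y_given_x.sum(axis=1, keepdims=True)

print(f"capacity ~ {blahut_arimoto(p_y_given_x):.2f} bits")
```

When the readout noise is comparable to the spacing between input levels, the computed capacity lands near one bit, which is the flavor of result summarized above.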
Entropy and information in neural spike trains: progress on the sampling problem.
TLDR: The major problem in information theoretic analysis of neural responses and other biological data is the reliable estimation of entropy-like quantities from small samples, where other techniques fail.
Complexity through nonextensivity
The problem of defining and studying complexity of a time series has interested people for years. In the context of dynamical systems, Grassberger has suggested that a slow approach of the entropy to …
Neural coding of natural stimuli: information at sub-millisecond resolution
TLDR: We have found that under natural stimulus conditions the fly visual system generates spikes and interspike intervals with extraordinary temporal precision.
Information theory, multivariate dependence, and genetic network inference
TLDR: We define the concept of dependence among multiple variables using maximum entropy techniques and introduce a graphical notation to denote the dependencies.
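One standard information-theoretic measure of dependence among more than two variables is the multi-information (total correlation): the difference between the sum of the marginal entropies and the joint entropy, i.e. how much a model built from the marginals alone overestimates the true entropy. The sketch below computes a plug-in estimate from a joint histogram; it illustrates the general flavor of multivariate dependence measures and is not the specific maximum entropy construction or graphical notation introduced in the paper.

```python
import numpy as np

def multi_information(samples, bins=4):
    """Total correlation I = sum_i H(X_i) - H(X_1, ..., X_n) in bits,
    estimated with a plug-in joint histogram. samples: (N, n_vars)."""
    joint, _ = np.histogramdd(samples, bins=bins)
    p = joint / joint.sum()

    def entropy(q):
        q = q[q > 0]
        return -(q * np.log2(q)).sum()

    h_joint = entropy(p.ravel())
    h_marginals = sum(
        entropy(p.sum(axis=tuple(a for a in range(p.ndim) if a != i)))
        for i in range(p.ndim)
    )
    return h_marginals - h_joint

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    z = rng.normal(size=5000)
    # Three variables driven by a common factor z: strongly dependent.
    dependent = np.column_stack([z + 0.5 * rng.normal(size=5000) for _ in range(3)])
    independent = rng.normal(size=(5000, 3))
    print(f"dependent   triple: {multi_information(dependent):.2f} bits")
    print(f"independent triple: {multi_information(independent):.2f} bits")
```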
Coincidences and Estimation of Entropies of Random Variables with Large Cardinalities
  • I. Nemenman
  • Mathematics, Computer Science
  • Entropy
  • 19 December 2011
TLDR: We perform an asymptotic analysis of the NSB estimator of entropy of a discrete random variable and show that the estimator has a non-trivial limit for a large cardinality of the studied variable.
Simple biochemical networks allow accurate sensing of multiple ligands with a single receptor
TLDR: We show that, with cross-talk, concentration of more than one chemical species can be inferred from one receptor, provided that the stochastic temporal sequence of receptor binding and unbinding events is accessible instead of its mean occupancy.
Information theory and learning: a physical approach
  • I. Nemenman
  • Computer Science, Physics
  • ArXiv
  • 8 September 2000
TLDR: We define predictive information as the mutual information between the past and the future of a time series, discuss its behavior as a function of the length of the series, and explain how other quantities of interest studied previously in learning theory emerge from this universally definable concept.
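Predictive information, as defined in this TLDR, is the mutual information I(past; future) of a time series. The sketch below gives a plug-in estimate for a binary series by histogramming (past word, future word) pairs; the Markov-chain test data, word length, and estimator are illustrative assumptions, not the paper's analysis.

```python
import numpy as np
from collections import Counter

def predictive_information(series, word_len=3):
    """Plug-in estimate (bits) of I(past; future) using words of
    `word_len` symbols immediately before and after each time point."""
    pairs = Counter()
    for t in range(word_len, len(series) - word_len):
        past = tuple(series[t - word_len:t])
        future = tuple(series[t:t + word_len])
        pairs[(past, future)] += 1
    total = sum(pairs.values())
    p_joint = {k: v / total for k, v in pairs.items()}
    p_past, p_future = Counter(), Counter()
    for (past, future), p in p_joint.items():
        p_past[past] += p
        p_future[future] += p
    return sum(p * np.log2(p / (p_past[past] * p_future[future]))
               for (past, future), p in p_joint.items())

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    # Binary Markov chain with strong persistence: the past is informative
    # about the future, so the predictive information is well above zero.
    stay = 0.9
    x = [0]
    for _ in range(200_000):
        x.append(x[-1] if rng.random() < stay else 1 - x[-1])
    print(f"persistent chain: {predictive_information(np.array(x)):.3f} bits")

    iid = rng.integers(0, 2, size=200_000)
    print(f"i.i.d. control:   {predictive_information(iid):.3f} bits")
```

For an i.i.d. sequence the estimate is close to zero (up to the small positive bias of the plug-in estimator), while any temporal structure pushes it up, which is what makes the quantity useful as a measure of learnable structure in a series.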