The mushroom body is an insect brain structure required for olfactory learning. Its principal neurons, the Kenyon cells (KCs), form a large cell population. The neuronal populations from which their olfactory input derives (olfactory sensory and projection neurons) can be identified individually by genetic, anatomical, and physiological criteria. We ask…
Entorhinal grid cells in mammals fire as a function of animal location, with spatially periodic response patterns. This nonlocal, periodic representation of location, a local variable, is unlike other neural codes, and there is no theoretical explanation for why such a code should exist. We examined how accurately the grid code with noisy neurons allows an ideal…
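One way to make the question above concrete is an ideal-observer (maximum-likelihood) decode of location from noisy grid-cell spikes. The sketch below is a minimal toy version, not the paper's model: the tuning curves, the module periods, the Poisson noise assumption, and all parameter values are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 1D grid code: a few modules with different spatial periods,
# each containing phase-shifted cells with periodic tuning curves.
periods = np.array([0.3, 0.42, 0.59])               # meters (arbitrary choices)
phases = np.linspace(0.0, 1.0, 8, endpoint=False)   # 8 cells per module
x_grid = np.linspace(0.0, 2.0, 2001)                # candidate locations

def rates(x, peak_rate=20.0, conc=4.0):
    """Firing rate of every cell at location(s) x (periodic tuning curves)."""
    x = np.atleast_1d(x)[:, None, None]
    phase_of_x = x / periods[None, :, None]          # location expressed as grid phase
    tuning = np.exp(conc * (np.cos(2 * np.pi * (phase_of_x - phases[None, None, :])) - 1.0))
    return peak_rate * tuning                        # shape: (n_x, n_modules, n_cells)

# Spikes emitted from the true location in a short window (Poisson noise).
x_true, dt = 0.87, 0.1
counts = rng.poisson(rates(x_true) * dt)

# Ideal-observer (maximum-likelihood) decoding: pick the location that
# maximizes the Poisson log-likelihood of the observed spike counts.
lam = rates(x_grid) * dt
log_lik = (counts * np.log(lam + 1e-12) - lam).sum(axis=(1, 2))
x_hat = x_grid[np.argmax(log_lik)]
print(f"true location {x_true:.3f} m, ML estimate {x_hat:.3f} m")
```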
We examined simultaneously recorded spikes from multiple rat grid cells to explain the mechanisms underlying their activity. Among grid cells with similar spatial periods, the population activity was confined to lie close to a two-dimensional (2D) manifold: grid cells differed only along two dimensions of their responses and otherwise were nearly identical.
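To build intuition for "cells differ only along two dimensions," here is a toy sketch in which every cell's spatial response is the same periodic template shifted by a cell-specific 2D offset, and the offset relative to a reference cell is recovered from the peak of a circular cross-correlation. For transparency it uses a square-lattice pattern rather than the triangular grid-cell lattice, and it is not the paper's analysis; all names and numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Square-lattice template, period 10 pixels in a 60x60 box (commensurate, so
# circular correlations wrap cleanly).
period, size = 10, 60
y, x = np.mgrid[0:size, 0:size].astype(float)

def rate_map(phase_x, phase_y):
    """Same periodic template for every cell, shifted by a 2D phase."""
    return (1 + np.cos(2 * np.pi * (x - phase_x * period) / period)) * \
           (1 + np.cos(2 * np.pi * (y - phase_y * period) / period))

true_phases = rng.uniform(0, 1, size=(5, 2))
maps = [rate_map(px, py) for px, py in true_phases]

# If cells differ only by a 2D offset, the offset between any cell and a
# reference cell sits at the peak of their circular cross-correlation.
ref = rate_map(0.0, 0.0)
F_ref = np.fft.fft2(ref - ref.mean())
for (px, py), m in zip(true_phases, maps):
    c = np.real(np.fft.ifft2(F_ref.conj() * np.fft.fft2(m - m.mean())))
    dy, dx = np.unravel_index(np.argmax(c), c.shape)
    print(f"true offset ({px * period:.1f}, {py * period:.1f}) px,",
          f"recovered ({dx % period}, {dy % period}) px (mod lattice)")
```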
Grid cell responses develop gradually after eye opening, but little is known about the rules that govern this process. We present a biologically plausible model for the formation of a grid cell network. An asymmetric spike time-dependent plasticity rule acts upon an initially unstructured network of spiking neurons that receive inputs encoding animal…
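For readers unfamiliar with asymmetric spike time-dependent plasticity (STDP), the sketch below shows the standard exponential pair-based window: a presynaptic spike that precedes a postsynaptic spike potentiates the synapse, the reverse ordering depresses it. This is a generic illustration, not the rule or parameters used in the model above.

```python
import numpy as np

def stdp_update(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Weight change for one pre/post spike pair.

    dt = t_post - t_pre (ms). Positive dt (pre before post) potentiates,
    negative dt depresses; both effects decay exponentially with |dt|.
    Parameter values are illustrative, not taken from the paper.
    """
    return np.where(dt >= 0,
                    a_plus * np.exp(-dt / tau_plus),
                    -a_minus * np.exp(dt / tau_minus))

# Apply the rule pairwise to two short spike trains and accumulate the change
# on a single synapse (all-to-all pairing, weight clipped to [0, 1]).
pre_spikes = np.array([10.0, 35.0, 60.0])    # ms
post_spikes = np.array([14.0, 30.0, 66.0])   # ms
w = 0.5
for t_post in post_spikes:
    w += stdp_update(t_post - pre_spikes).sum()
w = float(np.clip(w, 0.0, 1.0))
print(f"updated weight: {w:.3f}")
```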
Figure S2: Examples of concatenated subthreshold odor responses. Each subthreshold response (0–2 s following odor stimulus onset) is an average across trials of the filtered and baseline-subtracted membrane voltage. Examples are shown for two KC pairs, and such concatenated vectors were used for pairwise distance measurements (Figure 3). KC18 was tested with…
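As a rough sketch of the processing the caption describes: trial-average the filtered, baseline-subtracted voltage in the 0–2 s window, concatenate each cell's averaged responses across odors, and compute a pairwise distance between cells. The synthetic data, the baseline window, and the choice of Euclidean distance here are all assumptions for illustration, not the study's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data: 2 KCs x 5 odors x 6 trials, sampled at 1 kHz over the
# 0-2 s window after odor onset (filtering assumed to have been done already).
fs, window = 1000, 2.0
n_cells, n_odors, n_trials = 2, 5, 6
v = rng.normal(size=(n_cells, n_odors, n_trials, int(fs * window)))

# Baseline-subtract each trial (the first 100 ms serves here as a stand-in for
# a pre-stimulus baseline), then average across trials.
baseline = v[..., :100].mean(axis=-1, keepdims=True)
mean_resp = (v - baseline).mean(axis=2)        # (cells, odors, time)

# Concatenate each cell's averaged responses across odors into one long vector
# and compare cells with a simple Euclidean distance.
concat = mean_resp.reshape(n_cells, -1)        # (cells, odors * time)
dist = np.linalg.norm(concat[0] - concat[1])
print(f"pairwise distance between the two KCs: {dist:.2f}")
```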
The ability to store and later use information is essential for a variety of adaptive behaviors, including integration, learning, generalization, prediction, and inference. In this Review, we survey theoretical principles that can allow the brain to construct persistent states for memory. We identify requirements that a memory system must satisfy and analyze…
Grid cells, defined by their striking periodic spatial responses in open 2D arenas, appear to respond differently on 1D tracks: the multiple response fields are not periodically arranged, peak amplitudes vary across fields, and the mean spacing between fields is larger than in 2D environments. We ask whether such 1D responses are consistent with the…
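One way to get intuition for how irregular 1D fields can arise from a periodic 2D response is to generate an idealized 2D grid pattern and read it out along a straight 1D track: for a generic track angle the fields along the track are not evenly spaced even though the 2D pattern is perfectly periodic. This is only a toy illustration of the question posed above; the pattern model, track angle, and thresholds are arbitrary.

```python
import numpy as np

# Idealized 2D grid pattern: rectified sum of three plane waves 60 degrees apart.
def grid_rate(x, y, period=0.5, orientation=0.1):
    thetas = orientation + np.array([0.0, np.pi / 3, 2 * np.pi / 3])
    k = 4 * np.pi / (np.sqrt(3) * period)
    total = sum(np.cos(k * (np.cos(th) * x + np.sin(th) * y)) for th in thetas)
    return np.maximum(total, 0.0)

# Sample the 2D pattern along a straight 1D track that is not aligned with a
# lattice axis; the 1D response is a "slice" through the 2D pattern.
s = np.linspace(0.0, 8.0, 4000)       # position along the track (m)
track_angle = 0.37                    # radians, arbitrary
x, y = s * np.cos(track_angle), s * np.sin(track_angle)
r = grid_rate(x, y)

# Locate firing fields along the track and look at the spacing between them:
# for a generic slice the inter-field intervals are not all equal.
above = r > 0.5 * r.max()
edges = np.flatnonzero(np.diff(above.astype(int)) == 1)
field_starts = s[edges]
print("inter-field intervals (m):", np.round(np.diff(field_starts), 2))
```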
A Hopfield network is an auto-associative, distributed model of neural memory storage and retrieval. A form of error-correcting code, the Hopfield network can learn a set of patterns as stable points of the network dynamics and retrieve them from noisy inputs – thus Hopfield networks are their own decoders. Unlike in coding theory, where the information…
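A minimal sketch of the classic binary Hopfield network described above: patterns are stored with the Hebbian outer-product rule, and asynchronous sign updates pull a corrupted cue back to the nearest stored pattern, which is the "network as its own decoder" behavior. Network size, pattern count, and noise level are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

n_neurons, n_patterns = 200, 10
patterns = rng.choice([-1, 1], size=(n_patterns, n_neurons))

# Hebbian (outer-product) storage: each pattern becomes a stable point
# of the retrieval dynamics (at low memory load).
W = (patterns.T @ patterns) / n_neurons
np.fill_diagonal(W, 0.0)

def recall(cue, n_sweeps=10):
    """Asynchronous updates: each neuron takes the sign of its input field."""
    state = cue.copy()
    for _ in range(n_sweeps):
        for i in rng.permutation(n_neurons):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# Corrupt a stored pattern (flip 15% of the bits) and let the network clean it up.
target = patterns[0]
cue = target.copy()
flip = rng.choice(n_neurons, size=int(0.15 * n_neurons), replace=False)
cue[flip] *= -1
print("overlap before:", (cue @ target) / n_neurons,
      "after:", (recall(cue) @ target) / n_neurons)
```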
Information processing in the presence of noise has been a key challenge in multiple disciplines, including computer science, communications, and neuroscience. Among mechanisms for reducing the effects of such noise, the shiftmap code represents an analog variable by its residues with respect to distinct moduli (chosen as geometric scalings of an integer). Motivated…
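The sketch below illustrates only the residue idea described above: an analog value is represented by its residues with respect to several moduli chosen as geometric scalings of a base, and recovered by searching for the value whose residues best match the (noisy) observations. The base, ratio, noise level, and brute-force decoder are illustrative assumptions, not the paper's construction.

```python
import numpy as np

# Moduli chosen as geometric scalings of a base (arbitrary illustrative values).
base, ratio, n_moduli = 3.0, 1.5, 4
moduli = base * ratio ** np.arange(n_moduli)     # 3.0, 4.5, 6.75, 10.125

def encode(x):
    """Residues of an analog value with respect to each modulus."""
    return np.mod(x, moduli)

def decode(residues, x_max=80.0, resolution=1e-3):
    """Brute-force decoder: return the candidate whose residues best match."""
    candidates = np.arange(0.0, x_max, resolution)
    # circular (wrap-around) residue error per modulus, summed over moduli
    err = np.abs(np.mod(candidates[:, None], moduli) - residues)
    err = np.minimum(err, moduli - err).sum(axis=1)
    return candidates[np.argmin(err)]

x = 37.642
noisy = encode(x) + np.random.default_rng(4).normal(0, 0.05, size=n_moduli)
print(f"true {x}, decoded {decode(noisy):.3f}")
```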
The brain must robustly store a large number of memories, corresponding to the many events and scenes a person encounters over a lifetime. However, in existing neural network models, either the number of memory states grows only weakly with network size or recall fails catastrophically under vanishingly little noise. Here we show that it is possible to…