The mushroom body is an insect brain structure required for olfactory learning. Its principal neurons, the Kenyon cells (KCs), form a large cell population. The neuronal populations from which their olfactory input derives (olfactory sensory and projection neurons) can be identified individually by genetic, anatomical, and physiological criteria. We ask…
Entorhinal grid cells in mammals fire as a function of animal location, with spatially periodic response patterns. This nonlocal periodic representation of location, a local variable, is unlike other neural codes. There is no theoretical explanation for why such a code should exist. We examined how accurately the grid code with noisy neurons allows an ideal…
The ability to store and later use information is essential for a variety of adaptive behaviors, including integration, learning, generalization, prediction and inference. In this Review, we survey theoretical principles that can allow the brain to construct persistent states for memory. We identify requirements that a memory system must satisfy and analyze…
A Hopfield network is an auto-associative, distributed model of neural memory storage and retrieval. A form of error-correcting code, the Hopfield network can learn a set of patterns as stable points of the network dynamics, and retrieve them from noisy inputs; thus Hopfield networks are their own decoders. Unlike in coding theory, where the information…
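The storage-and-retrieval loop described in this abstract can be sketched directly. The following is a minimal illustration, not the construction studied in the paper: it uses the classical Hebbian outer-product learning rule and synchronous sign updates, so that stored patterns become fixed points that "decode" noisy inputs.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian outer-product rule: W = (1/n) * sum_p x_p x_p^T, zero diagonal.
    `patterns` is a (num_patterns, n) array of +/-1 entries."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, max_steps=50):
    """Iterate sign updates until a fixed point; stored patterns are stable
    points of the dynamics, so the network acts as its own decoder."""
    for _ in range(max_steps):
        new = np.sign(W @ state)
        new[new == 0] = 1.0  # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state
```

With a single stored ±1 pattern of length 64 and a few flipped bits, `recall` recovers the pattern in one update. The capacity of this rule for random patterns is famously limited (roughly 0.14n), which is the failure mode later work on memory capacity addresses.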
Self-localization during navigation with noisy sensors in an ambiguous world is computationally challenging, yet animals and humans excel at it. In robotics, Simultaneous Localization and Mapping (SLAM) algorithms solve this problem through joint sequential probabilistic inference of their own coordinates and those of external spatial landmarks. We generate the…
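The paper's joint inference over self-position and landmarks is beyond a snippet, but its sequential probabilistic core (predict under noisy motion, then correct against an observation) can be shown with a generic discrete Bayes (histogram) filter for self-localization alone. Everything here, including the grid size and kernels, is illustrative and not taken from the paper.

```python
import numpy as np

def predict(belief, move_kernel):
    """Motion update on a circular 1-D grid: intend one step to the right,
    blurred by a noise kernel centered on the intended cell."""
    n = len(belief)
    out = np.zeros(n)
    offset = len(move_kernel) // 2
    for i, p in enumerate(belief):
        for k, w in enumerate(move_kernel):
            out[(i + 1 + k - offset) % n] += p * w
    return out

def correct(belief, likelihood):
    """Measurement update: Bayes' rule followed by renormalization."""
    post = belief * likelihood
    return post / post.sum()
```

Alternating `predict` and `correct` maintains a full posterior over position, which is how ambiguity (e.g. aliased landmarks) gets resolved over successive observations rather than in a single step.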
The brain must robustly store a large number of memories, corresponding to the many events and scenes a person encounters over a lifetime. However, the number of memory states in existing neural network models either grows weakly with network size or recall performance fails catastrophically with vanishingly little noise. Here we show that it is possible to…
Information processing in the presence of noise has been a key challenge in multiple disciplines, including computer science, communications, and neuroscience. One such noise-tolerant scheme, the shift-map code, represents an analog variable by its residues with respect to distinct moduli (chosen as geometric scalings of an integer). Motivated…
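As a concrete picture of residue encoding with geometrically scaled moduli, here is a noiseless sketch. The coarse-to-fine decoding strategy below is one natural choice for illustration, not necessarily the paper's decoder, and the particular moduli are assumptions.

```python
def encode(x, moduli):
    """Residues of a scalar with respect to each modulus
    (moduli chosen as geometric scalings, e.g. 10, 2.5, 0.625)."""
    return [x % m for m in moduli]

def decode(residues, moduli):
    """Coarse-to-fine decoding: the coarsest residue gives a rough estimate
    (assumes 0 <= x < moduli[0]); each finer residue then snaps the estimate
    to the nearest value consistent with it."""
    est = residues[0]
    for r, m in zip(residues[1:], moduli[1:]):
        k = round((est - r) / m)  # nearest representative congruent to r (mod m)
        est = r + k * m
    return est
```

The finest modulus sets the precision of the representation while the coarser ones resolve ambiguity; under noise, small errors in the fine residues perturb the estimate only slightly, which is the intuition behind the code's noise tolerance.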
Shift-map codes have been studied as joint source-channel codes for continuous sources. These codes are useful in delay-limited scenarios and also provide better tolerance to deviations of the signal-to-noise ratio (SNR) from a target SNR, compared to separate source and channel coding. This paper defines a generalized family of shift-map codes that share a…
Acknowledgments: It is my privilege to have Professor Ila Fiete as my advisor for my graduate work. Her insights regarding both scholarly and practical matters in neuroscience have always encouraged me to take the next step to becoming a scientist. I am grateful to Professor Sriram Vishwanath and Professor Jonathan Pillow for their support and helpful…