Despite recent breakthroughs in the applications of deep neural networks, one setting that presents a persistent challenge is that of "one-shot learning." Traditional gradient-based networks require a lot of data to learn, often through extensive iterative training. When new data is encountered, the models must inefficiently relearn their parameters to …
Memories are not static but continue to be processed after encoding. This is thought to allow the integration of related episodes via the identification of patterns. Although this idea lies at the heart of contemporary theories of systems consolidation, it has yet to be demonstrated experimentally. Using a modified water-maze paradigm in which platforms are …
The structure-guided design of chloride-conducting channelrhodopsins has illuminated mechanisms underlying ion selectivity of this remarkable family of light-activated ion channels. The first generation of chloride-conducting channelrhodopsins, guided in part by development of a structure-informed electrostatic model for pore selectivity, included both the …
Our world can be succinctly and compactly described as structured scenes of objects and relations. A typical room, for example, contains salient objects such as tables, chairs and books, and these objects typically relate to each other by their underlying causes and semantics. This gives rise to correlated features, such as position, function and shape. …
We consider the general problem of modeling temporal data with long-range dependencies, wherein new observations are fully or partially predictable based on temporally-distant, past observations. A sufficiently powerful temporal model should separate predictable elements of the sequence from unpredictable elements, express uncertainty about those …
Over the course of systems consolidation, there is a switch from a reliance on detailed episodic memories to generalized schematic memories. This switch is sometimes referred to as "memory transformation." Here we demonstrate a previously unappreciated benefit of memory transformation, namely, its ability to enhance reinforcement learning in a dynamic …
Event memories are stored in hippocampal-cortical networks. In this issue of Neuron, two studies, Cowansage et al. (2014) and Tanaka et al. (2014), tag active cells during memory encoding and optogenetically manipulate the activity of these "engram" cells during subsequent recall to reveal how hippocampal and cortical cell ensembles interact during …
Relational reasoning is a central component of generally intelligent behavior, but has proven difficult for neural networks to learn. In this paper we describe how to use Relation Networks (RNs) as a simple plug-and-play module to solve problems that fundamentally hinge on relational reasoning. We tested RN-augmented networks on three tasks: visual question …
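
The plug-and-play module described in this abstract reduces to a single composite function over all object pairs, RN(O) = f_phi( sum over pairs (i, j) of g_theta(o_i, o_j) ). The following is a minimal NumPy sketch of that pairwise structure only; the two-layer perceptrons standing in for g_theta and f_phi, and every dimension and parameter below, are illustrative assumptions rather than the paper's architecture or hyperparameters.

import numpy as np

def mlp(x, w1, b1, w2, b2):
    """Two-layer perceptron with ReLU; a stand-in for g_theta or f_phi."""
    h = np.maximum(0.0, x @ w1 + b1)
    return h @ w2 + b2

def relation_network(objects, g_params, f_params):
    """RN(O) = f_phi( sum over pairs (i, j) of g_theta(o_i, o_j) )."""
    n = objects.shape[0]
    pair_sum = 0.0
    for i in range(n):
        for j in range(n):
            # g_theta sees each ordered pair of object representations.
            pair = np.concatenate([objects[i], objects[j]])
            pair_sum = pair_sum + mlp(pair, *g_params)
    # f_phi reads out the aggregated relational features.
    return mlp(pair_sum, *f_params)

# Toy usage: 5 objects with 4 features each (all sizes are illustrative).
rng = np.random.default_rng(0)
d, h, r, out = 4, 16, 8, 2
g_params = (rng.normal(size=(2 * d, h)), np.zeros(h),
            rng.normal(size=(h, r)), np.zeros(r))
f_params = (rng.normal(size=(r, h)), np.zeros(h),
            rng.normal(size=(h, out)), np.zeros(out))
objects = rng.normal(size=(5, d))
print(relation_network(objects, g_params, f_params))  # output of shape (2,)

Because the pairwise sum is permutation-invariant in the objects, the module can be dropped in after any front end that produces a set of object representations, which is what makes it plug-and-play.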