The prefrontal cortex subserves executive control, that is, the ability to select actions or thoughts in relation to internal goals. Here, we propose a theory that draws upon concepts from information theory to describe the architecture of executive control in the lateral prefrontal cortex. Supported by evidence from brain imaging in human subjects, the…
Stimulus-evoked neural activity is attenuated on stimulus repetition (repetition suppression), a phenomenon that is attributed to largely automatic processes in sensory neurons. By manipulating the likelihood of stimulus repetition, we found that repetition suppression in the human brain was reduced when stimulus repetitions were improbable (and thus, unexpected)…
Incoming sensory information is often ambiguous, and the brain has to make decisions during perception. "Predictive coding" proposes that the brain resolves perceptual ambiguity by anticipating the forthcoming sensory environment, generating a template against which to match observed sensory evidence. We observed a neural representation of predicted…
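As a rough, hypothetical illustration of the template-matching idea (not the study's actual stimuli or analysis), the sketch below compares noisy sensory evidence against two candidate templates and scores each match by correlation; the template vectors and noise level are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature templates for two predicted stimuli (e.g., face vs. house).
template_a = np.array([1.0, 0.2, 0.8, 0.1, 0.9])
template_b = np.array([0.1, 0.9, 0.2, 1.0, 0.3])

# Observed evidence: a noisy version of template A.
evidence = template_a + 0.3 * rng.normal(size=template_a.size)

def match(template, observed):
    """Score how well observed evidence matches a predicted template (Pearson correlation)."""
    return np.corrcoef(template, observed)[0, 1]

print("match A:", round(match(template_a, evidence), 2))  # typically the higher score
print("match B:", round(match(template_b, evidence), 2))
```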
Artificial neural networks are remarkably adept at sensory processing, sequence learning and reinforcement learning, but are limited in their ability to represent variables and data structures and to store data over long timescales, owing to the lack of an external memory. Here we introduce a machine learning model called a differentiable neural computer…
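The full differentiable neural computer couples a controller network with learned read/write heads, usage-based allocation and temporal links; the snippet below sketches only its most basic ingredient, a soft content-based read from an external memory matrix. The memory contents, the query key and the `content_read` helper are illustrative assumptions, not the published architecture.

```python
import numpy as np

def cosine_similarity(memory, key):
    """Cosine similarity between a query key and every memory row."""
    num = memory @ key
    den = np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    return num / den

def content_read(memory, key, strength):
    """Soft, content-based read: a softmax over similarities weights the rows."""
    scores = strength * cosine_similarity(memory, key)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ memory, weights

# Toy usage: a 4-slot memory of 3-d vectors, queried with a noisy copy of row 2.
memory = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0],
                   [0.5, 0.5, 0.0]])
key = np.array([0.1, 0.9, 0.1])
read_vector, read_weights = content_read(memory, key, strength=10.0)
print(read_weights)   # concentrates on the best-matching row
print(read_vector)    # approximately recovers the stored content
```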
Visual cognition is limited by computational capacity, because the brain can process only a fraction of the visual sensorium in detail, and by the inherent ambiguity of the information entering the visual system. Two mechanisms mitigate these burdens: attention prioritizes stimulus processing on the basis of motivational relevance, and expectations…
Perceptual inference is biased by foreknowledge about what is probable or possible. How prior expectations are neurally represented during visual perception, however, remains unknown. We used functional magnetic resonance imaging to measure brain activity in humans judging simple visual stimuli. Perceptual decisions were either biased in favor of a single…
Visual cortex is traditionally viewed as a hierarchy of neural feature detectors, with neural population responses being driven by bottom-up stimulus features. Conversely, "predictive coding" models propose that each stage of the visual hierarchy harbors two computationally distinct classes of processing unit: representational units that encode the…
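A minimal numerical sketch of this two-unit scheme, assuming a single linear stage: representation units hold the current prediction of the input, error units carry the residual, and inference proceeds by nudging the representation to explain away the error. The weights, input and learning rate are synthetic choices, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear generative model: the higher stage predicts the input as W @ r.
W = rng.normal(size=(8, 3))                                      # generative weights
x = W @ np.array([1.0, -0.5, 2.0]) + 0.05 * rng.normal(size=8)   # observed input

r = np.zeros(3)            # representational units (current prediction of the cause)
lr = 0.02
for step in range(500):
    e = x - W @ r          # error units: observed input minus top-down prediction
    r += lr * W.T @ e      # representations update to reduce the prediction error
print(np.round(r, 2))      # close to the latent cause [1.0, -0.5, 2.0]
```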
Categorical choices are preceded by the accumulation of sensory evidence in favor of one action or another. Current models describe evidence accumulation as a continuous process occurring at a constant rate, but this view is inconsistent with accounts of a psychological refractory period during sequential information processing. During multisample…
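For concreteness, here is a toy version of the constant-rate accumulation account that this abstract takes as its starting point: each evidence sample adds to a decision variable with a fixed gain, and the sign of the final total determines the categorical choice. The sample statistics and gain are invented.

```python
import numpy as np

def accumulate(samples, gain=1.0):
    """Constant-gain accumulation of sequential evidence samples toward a categorical choice.
    Positive samples favour category A, negative samples favour category B."""
    decision_variable = np.cumsum(gain * np.asarray(samples, dtype=float))
    choice = "A" if decision_variable[-1] > 0 else "B"
    return choice, decision_variable

# Toy usage: eight noisy samples drawn around a weak positive mean.
rng = np.random.default_rng(1)
samples = rng.normal(loc=0.2, scale=1.0, size=8)
choice, dv = accumulate(samples)
print(choice, np.round(dv, 2))
```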
Episodic memories consist of semantic information coupled with a rich array of contextual detail. Here, we investigate the neural processes by which information about the sensory context of a learning event is "bound" to the semantic representation of the to-be-encoded item. We present evidence that item-context binding during encoding is mediated by…
According to signal detection theoretical analyses, visual signals occurring at a cued location are detected more accurately, whereas frequently occurring ones are reported more often but are not better distinguished from noise. However, conventional analyses that estimate sensitivity and bias by comparing true- and false-positive rates offer limited…
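The conventional analysis referred to here typically reduces to estimating sensitivity (d') and criterion (c) from hit and false-alarm rates. A textbook-style implementation might look like the following; the trial counts are made up, and the 1/(2N) correction for extreme rates is one common convention rather than the study's own procedure.

```python
from scipy.stats import norm

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    """Sensitivity (d') and criterion (c) from hit and false-alarm rates,
    clamping rates of exactly 0 or 1 with a 1/(2N) correction."""
    n_signal = hits + misses
    n_noise = false_alarms + correct_rejections
    hit_rate = min(max(hits / n_signal, 1 / (2 * n_signal)), 1 - 1 / (2 * n_signal))
    fa_rate = min(max(false_alarms / n_noise, 1 / (2 * n_noise)), 1 - 1 / (2 * n_noise))
    d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)              # z(H) - z(F)
    criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))   # decision bias
    return d_prime, criterion

# Example: 50 signal trials and 50 noise trials with made-up outcome counts.
print(sdt_measures(hits=40, misses=10, false_alarms=15, correct_rejections=35))
```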