Aligning context-based statistical models of language with brain activity during reading

Abstract

Many statistical models for natural language processing exist, including context-based neural networks that (1) model the previously seen context as a latent feature vector, (2) integrate successive words into the context using some learned representation (embedding), and (3) compute output probabilities for incoming words given the context. On the other…
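The three-step scheme described above can be sketched as a minimal recurrent language model. The architecture and all sizes below are illustrative assumptions, not the model studied in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes; the abstract does not specify an architecture.
vocab_size, embed_dim, hidden_dim = 10, 8, 16

# (2) learned word representations (embeddings)
E = rng.normal(scale=0.1, size=(vocab_size, embed_dim))
# (1) recurrent weights that fold each word into a latent context vector
W_h = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
W_e = rng.normal(scale=0.1, size=(hidden_dim, embed_dim))
# (3) output weights mapping the context to next-word probabilities
W_o = rng.normal(scale=0.1, size=(vocab_size, hidden_dim))

def step(context, word_id):
    """Integrate one word into the context; return the new context
    and a probability distribution over the next word."""
    new_context = np.tanh(W_h @ context + W_e @ E[word_id])
    logits = W_o @ new_context
    probs = np.exp(logits - logits.max())  # numerically stable softmax
    probs /= probs.sum()
    return new_context, probs

context = np.zeros(hidden_dim)
for word_id in [3, 1, 4]:  # a toy sequence of word ids
    context, next_word_probs = step(context, word_id)
```

After each `step`, `context` summarizes the words seen so far, and `next_word_probs` is a valid distribution over the vocabulary.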

9 Figures and Tables
