Representation of linguistic form and function in recurrent neural networks

We present novel methods for analyzing the activation patterns of recurrent neural networks from a linguistic point of view and explore the types of linguistic structure they learn. As a case study, we use a standard standalone language model, and a multi-task gated recurrent network architecture consisting of two parallel pathways with shared word embeddings…
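
The abstract describes a multi-task gated recurrent architecture in which two parallel pathways share a single word-embedding layer. Below is a minimal sketch of such an architecture in PyTorch; the layer names, dimensions, and choice of task heads (next-word prediction plus a sentence-level regression target) are illustrative assumptions and not the authors' implementation.

```python
# A minimal sketch (not the authors' code) of a multi-task recurrent network
# with two parallel GRU pathways that share one word-embedding layer.
import torch
import torch.nn as nn

class SharedEmbeddingMultiTaskRNN(nn.Module):
    def __init__(self, vocab_size, embed_dim=256, hidden_dim=512, target_dim=4096):
        super().__init__()
        # Word embeddings shared by both pathways
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Pathway 1: recurrent language model predicting the next word
        self.lm_gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.lm_head = nn.Linear(hidden_dim, vocab_size)
        # Pathway 2: maps the whole sentence to a fixed-size target vector
        # (target_dim is an assumed dimensionality)
        self.aux_gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.aux_head = nn.Linear(hidden_dim, target_dim)

    def forward(self, tokens):
        emb = self.embed(tokens)                 # (batch, seq, embed_dim)
        lm_out, _ = self.lm_gru(emb)             # per-step hidden states
        next_word_logits = self.lm_head(lm_out)  # next-word prediction per step
        _, aux_h = self.aux_gru(emb)             # final hidden state of pathway 2
        target_pred = self.aux_head(aux_h[-1])   # sentence-level prediction
        return next_word_logits, target_pred

# Example usage with a toy batch of token ids
model = SharedEmbeddingMultiTaskRNN(vocab_size=10000)
tokens = torch.randint(0, 10000, (2, 7))         # (batch=2, seq_len=7)
logits, target = model(tokens)
print(logits.shape, target.shape)                # (2, 7, 10000) and (2, 4096)
```

Because both pathways read the same embedding layer, analyzing its activations (or the hidden states of either GRU) makes it possible to compare what each task induces the shared representations to encode.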
