Representation of Linguistic Form and Function in Recurrent Neural Networks

@article{Kdr2017RepresentationOL,
  title={Representation of Linguistic Form and Function in Recurrent Neural Networks},
  author={{\'A}kos K{\'a}d{\'a}r and Grzegorz Chrupa{\l}a and Afra Alishahi},
  journal={Computational Linguistics},
  year={2017},
  volume={43},
  pages={761--780}
}
We present novel methods for analyzing the activation patterns of recurrent neural networks from a linguistic point of view and explore the types of linguistic structure they learn. As a case study, we use a standard standalone language model, and a multi-task gated recurrent network architecture consisting of two parallel pathways with shared word embeddings: The Visual pathway is trained on predicting the representations of the visual scene corresponding to an input sentence, and the Textual…
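
The abstract describes a multi-task architecture: two recurrent pathways over a shared word-embedding table, one predicting a representation of the visual scene for a sentence and one predicting the next word. As a rough illustration only, not the authors' code, a minimal PyTorch-style sketch of such a two-pathway GRU might look like the following; all module names, dimensions, and the loss combination are assumptions:

```python
# Illustrative sketch of a two-pathway GRU with shared word embeddings,
# as described in the abstract. Names and sizes are assumptions, not the
# authors' implementation.
import torch
import torch.nn as nn


class TwoPathwayGRU(nn.Module):
    def __init__(self, vocab_size, embed_dim=256, hidden_dim=512, visual_dim=4096):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)  # shared by both pathways
        # Visual pathway: encodes the sentence, then predicts a feature
        # vector for the corresponding visual scene (e.g. CNN image features).
        self.visual_gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.to_visual = nn.Linear(hidden_dim, visual_dim)
        # Textual pathway: a language model predicting the next word.
        self.textual_gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.to_vocab = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer word indices
        e = self.embed(tokens)
        v_out, _ = self.visual_gru(e)
        t_out, _ = self.textual_gru(e)
        # The last hidden state summarizes the sentence for the visual target;
        # every time step emits next-word logits for the textual target.
        return self.to_visual(v_out[:, -1, :]), self.to_vocab(t_out)


# Hypothetical multi-task training step: a regression loss on the visual
# target plus next-word cross-entropy on the textual target.
model = TwoPathwayGRU(vocab_size=10000)
tokens = torch.randint(0, 10000, (8, 12))   # dummy batch of sentences
image_feats = torch.randn(8, 4096)          # dummy scene representations
visual_pred, word_logits = model(tokens)
loss = nn.functional.mse_loss(visual_pred, image_feats) + \
       nn.functional.cross_entropy(word_logits[:, :-1].reshape(-1, 10000),
                                   tokens[:, 1:].reshape(-1))
```

Because the embeddings are the only shared parameters in this sketch, gradients from both objectives shape the word representations while each pathway keeps its own recurrent dynamics, which is what makes the pathway-wise analysis described in the abstract possible.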

