Discovering the Compositional Structure of Vector Representations with Role Learning Networks

@article{Soulos2020DiscoveringTC,
  title={Discovering the Compositional Structure of Vector Representations with Role Learning Networks},
  author={Paul Soulos and R. Thomas McCoy and Tal Linzen and Paul Smolensky},
  journal={arXiv preprint arXiv:1910.09113},
  year={2020}
}
How can neural networks perform so well on compositional tasks even though they lack explicit compositional representations? We use a novel analysis technique called ROLE to show that recurrent neural networks perform well on such tasks by converging to solutions which implicitly represent symbolic structure. This method uncovers a symbolic structure which, when properly embedded in vector space, closely approximates the encodings of a standard seq2seq network trained to perform the…
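The "symbolic structure embedded in vector space" the abstract refers to is a tensor product representation (TPR): each symbol (filler) and each structural position (role) gets a vector, and a whole structure is encoded as the sum of filler-role outer products. The sketch below illustrates that scheme; the dimensions, random embeddings, and the `tpr_encode` helper are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative filler (symbol) and role (position) embeddings.
fillers = {s: rng.normal(size=4) for s in "abc"}   # one 4-d vector per symbol
roles = [rng.normal(size=3) for _ in range(3)]     # one 3-d vector per position

def tpr_encode(sequence):
    """Encode a symbol sequence as the flattened sum of
    filler-role outer products (a tensor product representation)."""
    enc = np.zeros((4, 3))
    for pos, sym in enumerate(sequence):
        enc += np.outer(fillers[sym], roles[pos])
    return enc.ravel()

v = tpr_encode("abc")  # a single 12-d vector encoding the whole string
```

Because the encoding is a sum of position-bound outer products, the same symbols in a different order yield a different vector, which is what lets a fixed-size embedding carry compositional structure.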
