Corpus ID: 214727914

A Hierarchical Transformer for Unsupervised Parsing

@article{Thillaisundaram2020AHT,
  title={A Hierarchical Transformer for Unsupervised Parsing},
  author={Ashok Thillaisundaram},
  journal={ArXiv},
  year={2020},
  volume={abs/2003.13841}
}
  • Ashok Thillaisundaram
  • Published 2020
  • Mathematics, Computer Science
  • ArXiv
  • The underlying structure of natural language is hierarchical; words combine into phrases, which in turn form clauses. An awareness of this hierarchical structure can aid machine learning models in performing many linguistic tasks. However, most such models process text purely sequentially, with no bias towards learning hierarchical structure encoded into their architecture. In this paper, we extend the recent transformer model (Vaswani et al., 2017) by enabling it to learn hierarchical… (abstract truncated; an illustrative sketch of hierarchy-aware attention follows this list)
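
The truncated abstract only hints at the mechanism, so below is a minimal, illustrative PyTorch sketch of one way to bias self-attention towards hierarchical structure, written loosely in the spirit of the Tree Transformer's constituent attention (Wang et al., 2019), which this paper cites. The module name ConstituentPriorAttention, the link projection, and every other detail here are assumptions made for illustration, not the architecture actually proposed in the paper.

# Illustrative sketch only: a self-attention layer gated by a soft
# constituency prior, assuming a Tree-Transformer-style mechanism.
# The exact method of Thillaisundaram (2020) is not described in the
# truncated abstract above, so all names and details are hypothetical.

import torch
import torch.nn as nn
import torch.nn.functional as F


class ConstituentPriorAttention(nn.Module):
    """Single-head self-attention whose scores are gated by a soft
    constituency prior: tokens inside the same (soft) constituent
    attend to each other more strongly."""

    def __init__(self, d_model: int):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # Scores how strongly token t links to its right neighbour t+1.
        self.link = nn.Linear(d_model, 1)
        self.scale = d_model ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        B, T, _ = x.shape

        # Probability that adjacent tokens belong to the same constituent.
        neighbour = torch.sigmoid(self.link(x)).squeeze(-1)   # (B, T)
        link = neighbour[:, :-1]                              # link between t and t+1

        # Prior(i, j) = product of link probabilities on the path i..j,
        # computed via cumulative sums of log-probabilities.
        log_link = torch.log(link + 1e-9)
        cum = F.pad(torch.cumsum(log_link, dim=1), (1, 0))    # (B, T)
        # Log-prior for i < j is cum[j] - cum[i]; symmetrise for i > j.
        log_prior = cum.unsqueeze(1) - cum.unsqueeze(2)       # (B, T, T)
        log_prior = -log_prior.abs()                          # symmetric, 0 on diagonal

        # Standard scaled dot-product attention, gated by the prior.
        scores = (self.q(x) @ self.k(x).transpose(-1, -2)) * self.scale
        attn = F.softmax(scores + log_prior, dim=-1)
        return attn @ self.v(x)


if __name__ == "__main__":
    layer = ConstituentPriorAttention(d_model=16)
    out = layer(torch.randn(2, 10, 16))
    print(out.shape)  # torch.Size([2, 10, 16])

In this sketch the learned link scores act as a soft bracketing: adjacent tokens with high link probability behave as if they belong to the same constituent, and the log-prior added before the softmax down-weights attention across weak links, which is one simple way of encoding a hierarchical bias into a transformer.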

    References

    Tree Transformer: Integrating Tree Structures into Self-Attention (2019) • 15 citations
    Attention Is All You Need (2017) • 10461 citations
    Hierarchical Multiscale Recurrent Neural Networks (2016) • 327 citations
    A Clockwork RNN (2014) • 296 citations
    Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks (Shen et al., 2019)