Learning Spoken Language Representations with Neural Lattice Language Modeling

  • Chao-Wei Huang, Yun-Nung (Vivian) Chen
  • Published in ACL 2020
  • Computer Science
  • Pre-trained language models have achieved substantial improvements on many NLP tasks. However, these methods are usually designed for written text, so they do not consider the properties of spoken language. This paper therefore generalizes the idea of language model pre-training to lattices generated by recognition systems. We propose a framework that trains neural lattice language models to provide contextualized representations for spoken language understanding tasks. The proposed two-stage…
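To make the lattice idea concrete: an ASR lattice is a directed acyclic graph whose arcs carry competing word hypotheses with posterior probabilities, and a lattice language model consumes these weighted contexts instead of a single 1-best transcript. The sketch below (plain Python, all names illustrative and not from the paper) shows the data structure and a forward pass that propagates probability mass through the graph; the paper's actual neural model operates on such marginalized contexts rather than this toy recursion.

```python
from collections import defaultdict

# A word lattice is a DAG: nodes are recognizer states, each arc carries
# a word hypothesis and its posterior probability.
# Illustrative sketch only -- function and variable names are assumptions,
# not the paper's implementation.

def forward_marginals(arcs, start):
    """Sum probability mass over all weighted paths from `start`
    (the forward recursion of the forward-backward algorithm).
    Assumes node ids are already in topological order."""
    out = defaultdict(list)
    for src, dst, word, p in arcs:
        out[src].append((dst, p))
    mass = defaultdict(float)
    mass[start] = 1.0
    for node in sorted(out):
        for dst, p in out[node]:
            mass[dst] += mass[node] * p
    return dict(mass)

# Two competing ASR hypotheses: "flights to boston" vs. "flight to austin"
arcs = [
    (0, 1, "flights", 0.6),
    (0, 1, "flight", 0.4),
    (1, 2, "to", 1.0),
    (2, 3, "boston", 0.7),
    (2, 3, "austin", 0.3),
]
marginals = forward_marginals(arcs, start=0)
# The final node accumulates (approximately) all probability mass,
# confirming the arc posteriors form a proper distribution.
```

A 1-best pipeline would keep only "flights to boston" and discard the 0.3-probability "austin" reading; operating on the lattice lets a downstream understanding model hedge across both.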

