Corpus ID: 855563

A Neural Language Model for Dynamically Representing the Meanings of Unknown Words and Entities in a Discourse

@inproceedings{Kobayashi2017ANL,
  title={A Neural Language Model for Dynamically Representing the Meanings of Unknown Words and Entities in a Discourse},
  author={S. Kobayashi and N. Okazaki and Kentaro Inui},
  booktitle={IJCNLP},
  year={2017}
}
  • This study addresses the problem of identifying the meaning of unknown words or entities in a discourse with respect to the word embedding approaches used in neural language models. We propose a method for on-the-fly construction and exploitation of word embeddings in both the input and output layers of a neural model by tracking contexts. This extends the dynamic entity representation used in Kobayashi et al. (2016) and incorporates a copy mechanism proposed independently by Gu et al. (2016…
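The core idea in the abstract can be illustrated with a minimal sketch. This is hypothetical code, not the authors' implementation: the paper builds context vectors with an RNN and uses the resulting embeddings in both the input and output layers, whereas here we simply accumulate per-token context vectors and average them (max-pooling, as in the authors' 2016 work, is another aggregation choice). All names are illustrative.

```python
import numpy as np

class DynamicEmbeddings:
    """Sketch of on-the-fly embeddings for unknown words: each unknown
    token's vector is derived from the context vectors observed for it
    so far in the discourse. Illustrative only; not the paper's code."""

    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        # Shared initial vector used before a token has any observed context.
        self.base = rng.normal(scale=0.1, size=dim)
        self.contexts = {}  # surface form -> list of observed context vectors

    def observe(self, token, context_vec):
        # Record the context in which the unknown token just appeared.
        self.contexts.setdefault(token, []).append(np.asarray(context_vec, float))

    def embed(self, token):
        # Unseen tokens fall back to the shared vector; otherwise the
        # embedding is the mean of the accumulated context vectors.
        if token not in self.contexts:
            return self.base
        return np.mean(self.contexts[token], axis=0)

emb = DynamicEmbeddings(dim=3)
emb.observe("Lerrel", [1.0, 0.0, 1.0])  # first mention of an unknown entity
emb.observe("Lerrel", [3.0, 2.0, 1.0])  # second mention in a new context
print(emb.embed("Lerrel"))  # -> [2. 1. 1.]
```

Because the same dynamically built vectors can serve as output-layer weights, the model can also assign probability to an unknown token by scoring its context-derived embedding, which is where the copy mechanism enters in the paper.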
    11 Citations


    Dynamic Integration of Background Knowledge in Neural NLU Systems (47 citations)
    Second-order contexts from lexical substitutes for few-shot learning of word representations (2 citations)
    Contextual Augmentation: Data Augmentation by Words with Paradigmatic Relations (130 citations)
    Improving Pre-Trained Multilingual Models with Vocabulary Expansion (7 citations)
    Representing Movie Characters in Dialogues (2 citations)
    Knowledge Efficient Deep Learning for Natural Language Processing
    The Referential Reader: A Recurrent Entity Network for Anaphora Resolution (8 citations)
    Neural Code Completion with Anonymized Variable Names

    References

    Showing 1-10 of 45 references
    Pointing the Unknown Words (378 citations; highly influential)
    Do Multi-Sense Embeddings Improve Natural Language Understanding? (187 citations)
    Two Discourse Driven Language Models for Semantics (28 citations)
    Context-dependent word representation for neural machine translation (50 citations)
    Dynamic Entity Representations in Neural Language Models (57 citations; highly influential)
    Character-Aware Neural Language Models (1,247 citations)
    Dynamic Entity Representation with Max-pooling Improves Machine Reading (35 citations)
    Achieving Open Vocabulary Neural Machine Translation with Hybrid Word-Character Models (304 citations)
    A Neural Probabilistic Language Model (4,669 citations)