Corpus ID: 201633105

An ELMo-inspired approach to SemDeep-5's Word-in-Context task

@inproceedings{Ansell2019AnEA,
  title={An ELMo-inspired approach to SemDeep-5's Word-in-Context task},
  author={Alan Ansell and Felipe Bravo-Marquez and B. Pfahringer},
  booktitle={SemDeep@IJCAI},
  year={2019}
}
  • This paper describes a submission to the Word-in-Context competition at the IJCAI 2019 SemDeep-5 workshop. The task is to determine whether a given focus word is used in the same sense or in different senses in two contexts. We took an ELMo-inspired approach similar to the baseline model in the task description paper, in which contextualized representations are obtained for the focus words and a classification is made according to the degree of similarity between these representations. Our model had a few…
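The similarity-based classification the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' actual model: the placeholder vectors stand in for ELMo outputs for the focus word in each context, and the decision threshold is illustrative.

```python
import numpy as np

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def same_sense(emb_a: np.ndarray, emb_b: np.ndarray, threshold: float = 0.7) -> bool:
    """Predict 'same sense' when the contextualized focus-word
    embeddings are sufficiently similar (threshold is illustrative)."""
    return cosine_similarity(emb_a, emb_b) >= threshold

# Placeholder embeddings standing in for the contextualized
# representations of the focus word in the two contexts.
rng = np.random.default_rng(0)
emb_1 = rng.normal(size=1024)
emb_2 = emb_1 + 0.1 * rng.normal(size=1024)  # a near-duplicate context

print(same_sense(emb_1, emb_2))
```

In practice the embeddings would come from a contextual encoder such as ELMo, and the threshold (or a small classifier on top of the similarity score) would be tuned on the WiC training data.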
