Corpus ID: 9447219

Learned in Translation: Contextualized Word Vectors

@inproceedings{McCann2017LearnedIT,
  title={Learned in Translation: Contextualized Word Vectors},
  author={Bryan McCann and James Bradbury and Caiming Xiong and Richard Socher},
  booktitle={Advances in Neural Information Processing Systems (NIPS)},
  year={2017}
}
Abstract: Computer vision has benefited from initializing multiple deep layers with weights pretrained on large supervised training sets like ImageNet. [...] For fine-grained sentiment analysis and entailment, CoVe improves performance of our baseline models to the state of the art.