Corpus ID: 215768677

SPECTER: Document-level Representation Learning using Citation-informed Transformers

@inproceedings{Cohan2020SPECTERDR,
  title={SPECTER: Document-level Representation Learning using Citation-informed Transformers},
  author={Arman Cohan and Sergey Feldman and Iz Beltagy and Doug Downey and Daniel S. Weld},
  booktitle={ACL},
  year={2020}
}
Abstract: Representation learning is a critical ingredient for natural language processing systems. Recent Transformer language models like BERT learn powerful textual representations, but these models are targeted towards token- and sentence-level training objectives and do not leverage information on inter-document relatedness, which limits their document-level representation power. For applications on scientific documents, such as classification and recommendation, the embeddings power strong…
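The citation-informed training signal the abstract alludes to can be illustrated with a triplet-style objective: a paper's embedding is pulled toward a paper it cites and pushed away from an unrelated one. The sketch below is a minimal, hypothetical illustration using toy NumPy vectors in place of real Transformer document embeddings; it is not the paper's actual implementation.

```python
import numpy as np

def l2_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Euclidean distance between two document embeddings."""
    return float(np.linalg.norm(a - b))

def triplet_margin_loss(query: np.ndarray,
                        positive: np.ndarray,
                        negative: np.ndarray,
                        margin: float = 1.0) -> float:
    """Triplet margin loss: encourage d(query, positive) + margin
    to be smaller than d(query, negative), where 'positive' is a
    citation-linked paper and 'negative' is an unrelated one."""
    return max(0.0, l2_distance(query, positive)
                    - l2_distance(query, negative) + margin)

# Toy 2-d embeddings standing in for Transformer document vectors.
query   = np.array([1.0, 0.0])
cited   = np.array([0.9, 0.1])    # positive: linked by a citation
uncited = np.array([-1.0, 0.0])   # negative: no citation link

loss = triplet_margin_loss(query, cited, uncited)
# The cited paper is already closer than the uncited one by more
# than the margin, so the loss is zero for this triplet.
```

Minimizing this loss over many (query, cited, uncited) triplets is one way inter-document relatedness from the citation graph can shape document-level embeddings.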
