SPECTER: Document-level Representation Learning using Citation-informed Transformers

@inproceedings{Cohan2020SPECTERDR,
  title={SPECTER: Document-level Representation Learning using Citation-informed Transformers},
  author={Arman Cohan and Sergey Feldman and Iz Beltagy and Doug Downey and Daniel S. Weld},
  booktitle={ACL},
  year={2020}
}
Representation learning is a critical ingredient for natural language processing systems. Recent Transformer language models like BERT learn powerful textual representations, but these models are targeted towards token- and sentence-level training objectives and do not leverage information on inter-document relatedness, which limits their document-level representation power. For applications on scientific documents, such as classification and recommendation, accurate embeddings of documents are…
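The abstract above motivates document-level embeddings of scientific papers for tasks such as classification and recommendation. As a minimal sketch of how such embeddings are typically extracted from a released SPECTER-style checkpoint, the snippet below assumes the Hugging Face model id "allenai/specter" and the common title + [SEP] + abstract input format; neither detail appears in this excerpt and both should be checked against the paper's released code.

# Sketch: embedding a paper (title + abstract) with a SPECTER-style model.
# Assumption: the Hugging Face checkpoint "allenai/specter" and the
# title + [SEP] + abstract input convention (not stated in this excerpt).
from transformers import AutoTokenizer, AutoModel
import torch

tokenizer = AutoTokenizer.from_pretrained("allenai/specter")
model = AutoModel.from_pretrained("allenai/specter")

paper = {
    "title": "SPECTER: Document-level Representation Learning using "
             "Citation-informed Transformers",
    "abstract": "Representation learning is a critical ingredient ...",
}

# Concatenate title and abstract with the separator token and run them
# through the Transformer encoder.
text = paper["title"] + tokenizer.sep_token + paper["abstract"]
inputs = tokenizer(text, padding=True, truncation=True,
                   max_length=512, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# The [CLS] token's final hidden state serves as the document embedding.
embedding = outputs.last_hidden_state[:, 0, :]
print(embedding.shape)

The resulting fixed-size vector can be compared across papers (e.g. with cosine similarity) for recommendation or fed to a lightweight classifier, without fine-tuning the encoder itself.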
Citations

Aspect-based Document Similarity for Research Papers

Document Embedding using piped ELM-GAN Model
  • Arefeh Yavary, H. Sajedi
  • Computer Science
  • 2021 15th International Conference on Ubiquitous Information Management and Communication (IMCOM)
  • 2021

Incorporating Visual Layout Structures for Scientific Text Classification

References

Showing 1-10 of 59 references
A Comprehensive Survey on Graph Neural Networks
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
A Simple but Tough-to-Beat Baseline for Sentence Embeddings
An Overview of Microsoft Academic Service (MAS) and Applications
Improving Textual Network Embedding with Global Attention via Optimal Transport
Improving Textual Network Learning with Variational Homophilic Embeddings
Simplifying Graph Convolutional Networks
Attention is All you Need
Inductive Representation Learning on Large Graphs
SciBERT: A Pretrained Language Model for Scientific Text