Corpus ID: 225067750

Keyphrase Extraction with Dynamic Graph Convolutional Networks and Diversified Inference

@article{Zhang2020KeyphraseEW,
  title={Keyphrase Extraction with Dynamic Graph Convolutional Networks and Diversified Inference},
  author={Haoyu Zhang and Dingkun Long and G. Xu and Pengjun Xie and Fei Huang and Ji Wang},
  journal={ArXiv},
  year={2020},
  volume={abs/2010.12828}
}
Keyphrase extraction (KE) aims to summarize a set of phrases that accurately express a concept or a topic covered in a given document. Recently, the Sequence-to-Sequence (Seq2Seq) generative framework has been widely used in the KE task and has obtained competitive performance on various benchmarks. The main challenges of Seq2Seq methods lie in acquiring an informative latent document representation and better modeling the compositionality of the target keyphrase set, which will directly affect the…


References

SHOWING 1-10 OF 40 REFERENCES
Deep Keyphrase Generation
TLDR
Empirical analysis on six datasets demonstrates that the proposed encoder-decoder generative model for keyphrase prediction not only achieves a significant performance boost on extracting keyphrases that appear in the source text, but can also generate absent keyphrases based on the semantic meaning of the text.
DivGraphPointer: A Graph Pointer Network for Extracting Diverse Keyphrases
TLDR
An end-to-end method called DivGraphPointer is presented for extracting a set of diversified keyphrases from a document; it combines the advantages of traditional graph-based ranking methods and recent neural network-based approaches.
Title-Guided Encoding for Keyphrase Generation
TLDR
This work introduces a new model called Title-Guided Network (TG-Net) for the automatic keyphrase generation task, based on the encoder-decoder architecture with two new features: the title is additionally employed as a query-like input, and a title-guided encoder gathers relevant information from the title for each word in the document.
Incorporating Linguistic Constraints into Keyphrase Generation
TLDR
A parallel Seq2Seq network with coverage attention is proposed to alleviate the overlapping-phrase problem; the coverage vector keeps track of the attention history and decides whether parts of the source text have already been covered by previously generated keyphrases.
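A minimal sketch of how such a coverage vector can enter the attention computation. The array names, weight shapes, and the additive scoring form are assumptions for illustration, not the paper's exact formulation:

```python
import numpy as np

def coverage_attention(enc_states, dec_state, coverage, W_h, W_s, w_c, v):
    """One attention step with a coverage term (illustrative shapes only).

    enc_states: (T, d_h) encoder states, dec_state: (d_s,) decoder state,
    coverage:   (T,) running sum of all previous attention distributions.
    """
    # Additive attention score; the coverage term lets the model discourage
    # source positions that earlier decoding steps have already attended to.
    scores = np.tanh(enc_states @ W_h + dec_state @ W_s + np.outer(coverage, w_c)) @ v
    attn = np.exp(scores - scores.max())
    attn /= attn.sum()
    # The coverage vector accumulates attention history across decoding steps.
    return attn, coverage + attn
```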
EvolveGCN: Evolving Graph Convolutional Networks for Dynamic Graphs
TLDR
This work proposes EvolveGCN, which adapts the graph convolutional network (GCN) model along the temporal dimension without resorting to node embeddings, and captures the dynamism of the graph sequence by using an RNN to evolve the GCN parameters.
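A rough sketch of that idea: the GCN weight matrix used on each graph snapshot is produced by a recurrent update rather than kept as a static parameter. The simple tanh transition below stands in for the GRU-based evolution and is an assumption, not the paper's exact cell:

```python
import numpy as np

def gcn_layer(A_hat, H, W):
    # One graph convolution on a single snapshot: normalized adjacency
    # A_hat (N, N), node features H (N, d), weights W (d, d_out).
    return np.maximum(A_hat @ H @ W, 0.0)  # ReLU

def evolve_gcn(snapshots, features, W0, U):
    """Run a GCN over a sequence of graph snapshots, evolving the weights.

    snapshots: list of normalized adjacency matrices, features: list of node
    feature matrices, W0: initial GCN weights, U: recurrent transition matrix
    (a stand-in for the GRU that EvolveGCN uses to update W over time).
    """
    W, outputs = W0, []
    for A_hat, H in zip(snapshots, features):
        W = np.tanh(W @ U)          # recurrent update of the GCN parameters
        outputs.append(gcn_layer(A_hat, H, W))
    return outputs
```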
Neural Keyphrase Generation via Reinforcement Learning with Adaptive Rewards
TLDR
This work proposes a reinforcement learning (RL) approach for keyphrase generation, with an adaptive reward function that encourages a model to generate both sufficient and accurate keyphrases, and introduces a new evaluation method that incorporates name variations of the ground-truth keyphrases using the Wikipedia knowledge base.
Incorporating Expert Knowledge into Keyphrase Extraction
TLDR
This paper learns keyphrase taggers for research papers using token-based features incorporating linguistic, surface-form, and document-structure information through sequence labeling, and demonstrates that using document features alone, the tagger trained with Conditional Random Fields performs on par with existing state-of-the-art systems.
Encoding Sentences with Graph Convolutional Networks for Semantic Role Labeling
TLDR
A version of graph convolutional networks (GCNs), a recent class of neural networks operating on graphs, is proposed that is suited to modeling syntactic dependency graphs; the authors observe that GCN layers are complementary to LSTM ones.
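The general pattern is to stack a graph convolution over contextual word states using a dependency-derived adjacency matrix, as in the toy sketch below. The symmetric normalization and single undirected layer are simplifying assumptions; the paper's syntactic GCN additionally distinguishes edge directions and labels:

```python
import numpy as np

def normalize_adjacency(A):
    # Add self-loops and apply symmetric normalization D^-1/2 (A + I) D^-1/2.
    A = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
    return d_inv_sqrt @ A @ d_inv_sqrt

def syntactic_gcn_layer(lstm_states, dep_adjacency, W):
    # lstm_states:   (T, d) contextual word representations from a (Bi)LSTM.
    # dep_adjacency: (T, T) 0/1 matrix built from the dependency parse.
    A_hat = normalize_adjacency(dep_adjacency)
    return np.maximum(A_hat @ lstm_states @ W, 0.0)
```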
Graph-based Neural Multi-Document Summarization
TLDR
This model improves upon traditional graph-based extractive approaches and a vanilla GRU sequence model without a graph, and it achieves competitive results against other state-of-the-art multi-document summarization systems.
Get To The Point: Summarization with Pointer-Generator Networks
TLDR
A novel architecture is proposed that augments the standard sequence-to-sequence attentional model in two orthogonal ways: a hybrid pointer-generator network can copy words from the source text via pointing, which aids accurate reproduction of information, while retaining the ability to produce novel words through the generator.
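A compact sketch of the copy mechanism's final output distribution. The function name and the assumption that every source token is already mapped to an index in the output vocabulary are illustrative simplifications:

```python
import numpy as np

def pointer_generator_dist(p_vocab, attn, src_ids, p_gen):
    """Mix generation and copying into one output distribution.

    p_vocab: (V,) softmax over the output vocabulary,
    attn:    (T,) attention over source positions,
    src_ids: (T,) vocabulary index of each source token,
    p_gen:   scalar in [0, 1], probability of generating vs. copying.
    """
    final = p_gen * p_vocab
    # Scatter-add the copy probabilities onto the ids of the source tokens.
    np.add.at(final, src_ids, (1.0 - p_gen) * attn)
    return final
```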