“Killing Me” Is Not a Spoiler: Spoiler Detection Model using Graph Neural Networks with Dependency Relation-Aware Attention Mechanism

@inproceedings{Chang2021KillingMI,
  title={“Killing Me” Is Not a Spoiler: Spoiler Detection Model using Graph Neural Networks with Dependency Relation-Aware Attention Mechanism},
  author={Buru Chang and Inggeol Lee and Hyunjae Kim and Jaewoo Kang},
  booktitle={EACL},
  year={2021}
}
Several machine learning-based spoiler detection models have been proposed recently to protect users from spoilers on review websites. Although dependency relations between context words are important for detecting spoilers, current attention-based spoiler detection models do not fully exploit these relations. To address this problem, we propose a new spoiler detection model called SDGNN that is based on syntax-aware graph neural networks. In the experiments on two real-world…
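As a concrete illustration of the general approach described in the abstract, the following is a minimal PyTorch sketch of a graph attention layer whose attention weights are conditioned on the dependency relation labels of parse-tree edges. It is not the authors' SDGNN implementation; the class name DepRelAwareAttention, the way the relation embedding is added to keys and values, and the toy dependency edges in the usage example are illustrative assumptions.

# Minimal sketch (not the authors' code) of a dependency relation-aware
# graph attention layer. All names and the toy parse below are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DepRelAwareAttention(nn.Module):
    """One GNN layer whose attention weights depend on the dependency
    relation label of each edge as well as on the word representations."""

    def __init__(self, hidden_dim: int, num_relations: int):
        super().__init__()
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.key = nn.Linear(hidden_dim, hidden_dim)
        self.value = nn.Linear(hidden_dim, hidden_dim)
        # One embedding per dependency relation type (nsubj, dobj, ...).
        self.rel_emb = nn.Embedding(num_relations, hidden_dim)

    def forward(self, h, edge_index, edge_rel):
        # h:          (num_words, hidden_dim) word representations
        # edge_index: (2, num_edges) head/dependent indices of the parse tree
        # edge_rel:   (num_edges,) relation label id for every edge
        src, dst = edge_index
        q, k, v = self.query(h), self.key(h), self.value(h)

        # Relation-aware attention logit for every dependency edge.
        rel = self.rel_emb(edge_rel)
        logits = (q[dst] * (k[src] + rel)).sum(-1) / (h.size(-1) ** 0.5)

        # Normalize attention per destination word (softmax over its incoming
        # edges); a Python loop is used here only for clarity.
        alpha = torch.zeros_like(logits)
        for node in dst.unique():
            mask = dst == node
            alpha[mask] = F.softmax(logits[mask], dim=0)

        # Aggregate messages from syntactic neighbours.
        out = torch.zeros_like(h)
        out.index_add_(0, dst, alpha.unsqueeze(-1) * (v[src] + rel))
        return F.relu(out)


if __name__ == "__main__":
    # Toy parse for "killing me softly": head -> dependent edges with
    # made-up relation ids; 3 words, 16-dim states, 4 relation types.
    h = torch.randn(3, 16)
    edge_index = torch.tensor([[0, 0], [1, 2]])  # killing -> me, killing -> softly
    edge_rel = torch.tensor([0, 1])              # e.g. dobj, advmod
    layer = DepRelAwareAttention(hidden_dim=16, num_relations=4)
    print(layer(h, edge_index, edge_rel).shape)  # torch.Size([3, 16])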


References

(Showing 1–10 of 14 references)
Spoiler alert: Machine learning approaches to detect social media posts with revelatory information
An automatic alternative that could alert users when a piece of text contains a spoiler is created, and metadata-based features are developed that substantially improve performance on the spoiler detection task.
Fine-Grained Spoiler Detection from Large-Scale Review Corpora
An end-to-end neural network architecture to detect spoiler sentences in review corpora is developed and quantitative and qualitative results demonstrate that the proposed method substantially outperforms existing baselines.
A Deep Neural Spoiler Detection Model Using a Genre-Aware Attention Mechanism
Proposes a new deep neural spoiler detection model with a genre-aware attention mechanism that utilizes genre information to detect spoilers, whose characteristics vary across genres.
Finding the Storyteller: Automatic Spoiler Tagging using Linguistic Cues
This work develops topic models, based on Latent Dirichlet Allocation (LDA), but using linguistic dependency information in place of simple features from bag of words (BOW) representations, and demonstrates the effectiveness of the technique over four movie-comment datasets of different scales.
Graph Convolution over Pruned Dependency Trees Improves Relation Extraction
Proposes an extension of graph convolutional networks tailored for relation extraction that pools information over arbitrary dependency structures efficiently in parallel, and applies a novel pruning strategy to the input trees that keeps only the words immediately around the shortest path between the two entities among which a relation might hold.
Spoiler detection in TV program tweets
Graph Convolutional Networks With Argument-Aware Pooling for Event Detection
This work investigates a convolutional neural network based on dependency trees to perform event detection and proposes a novel pooling method that relies on entity mentions to aggregate the convolution vectors.
Encoding Sentences with Graph Convolutional Networks for Semantic Role Labeling
Proposes a version of graph convolutional networks (GCNs), a recent class of neural networks operating on graphs, suited to modeling syntactic dependency graphs, and observes that GCN layers are complementary to LSTM layers.
Convolutional Neural Networks for Sentence Classification
The proposed CNN models improve upon the state of the art on 4 out of 7 tasks, including sentiment analysis and question classification, and a simple modification to the architecture allows the use of both task-specific and static word vectors.
GloVe: Global Vectors for Word Representation
Proposes a new global log-bilinear regression model that combines the advantages of the two major model families in the literature, global matrix factorization and local context window methods, and produces a vector space with meaningful substructure.