On the Importance of Word and Sentence Representation Learning in Implicit Discourse Relation Classification

@article{Liu2020OnTI,
  title={On the Importance of Word and Sentence Representation Learning in Implicit Discourse Relation Classification},
  author={Xin Liu and Jiefu Ou and Yangqiu Song and Xin Jiang},
  journal={ArXiv},
  year={2020},
  volume={abs/2004.12617}
}
Implicit discourse relation classification is one of the most difficult parts of shallow discourse parsing, as predicting the relation without explicit connectives requires language understanding at both the text-span level and the sentence level. Previous studies mainly focus on the interactions between two arguments. We argue that a powerful contextualized representation module, a bilateral multi-perspective matching module, and a global information fusion module are all important to…
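The bilateral multi-perspective matching module named in the abstract follows the matching operation of BiMPM (Wang et al., 2017): cosine similarity between two argument vectors, recomputed under several learned re-weightings of the dimensions. A minimal NumPy sketch of that operation (function name and toy shapes are illustrative, not the paper's exact implementation):

```python
import numpy as np

def multi_perspective_match(v1, v2, W):
    """Cosine similarity between two span vectors under l learned perspectives.

    v1, v2: (d,) hidden vectors for the two discourse arguments.
    W:      (l, d) perspective weights; each row re-weights the dimensions
            before the cosine similarity is computed.
    Returns an (l,) matching vector.
    """
    a = W * v1                                   # (l, d) weighted copies of v1
    b = W * v2                                   # (l, d) weighted copies of v2
    num = (a * b).sum(axis=1)
    den = np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1) + 1e-8
    return num / den

# toy example: 4 perspectives over 6-dimensional argument vectors
rng = np.random.default_rng(0)
m = multi_perspective_match(rng.normal(size=6), rng.normal(size=6),
                            rng.normal(size=(4, 6)))
print(m.shape)  # (4,)
```

Each perspective yields one similarity score, so the matching vector grows with the number of perspectives rather than with the hidden size.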

Figures and Tables from this paper

Citations

Implicit Discourse Relation Classification Based on Semantic Graph Attention Networks
TLDR
A semantic graph neural network is proposed to describe the syntactic structure of sentences and the semantic interactions between sentence pairs, combined with a convolutional neural network with different convolutional kernels to extract multi-granularity semantic features.
Knowledge Distillation for Discourse Relation Analysis
TLDR
This paper takes the first step to exploit the knowledge distillation (KD) technique for discourse relation analysis and aims to train a focused single-data single-task student with the help of a general multi-data multi-task teacher.
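In the generic form of knowledge distillation, the student is trained to match the teacher's temperature-softened output distribution alongside the gold label. A sketch of that standard objective (temperature, weighting, and names are illustrative, not this paper's exact loss):

```python
import numpy as np

def softmax(z, T=1.0):
    z = np.asarray(z, dtype=float) / T
    z = z - z.max()                      # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, gold, T=2.0, alpha=0.5):
    """alpha * soft-target loss (teacher) + (1 - alpha) * hard-label loss."""
    p_teacher = softmax(teacher_logits, T)                 # softened targets
    log_p_student = np.log(softmax(student_logits, T) + 1e-12)
    soft = -(p_teacher * log_p_student).sum() * T * T      # T^2 rescales gradients
    hard = -np.log(softmax(student_logits)[gold] + 1e-12)
    return alpha * soft + (1.0 - alpha) * hard

# a student that agrees with its teacher incurs a lower loss
agree = distillation_loss([3.0, 0.0, 0.0], [3.0, 0.0, 0.0], gold=0)
disagree = distillation_loss([0.0, 0.0, 3.0], [3.0, 0.0, 0.0], gold=0)
print(agree < disagree)  # True
```

The soft term lets the focused student inherit the multi-data multi-task teacher's relative confidences across relation senses, not just its top prediction.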
The DISRPT 2021 Shared Task on Elementary Discourse Unit Segmentation, Connective Detection, and Relation Classification
TLDR
The data included in the shared task is reviewed, covering nearly 3 million manually annotated tokens from 16 datasets in 11 languages, and system performance on each task is reported for both annotated and plain-tokenized versions of the data.
Exploring Discourse Structures for Argument Impact Classification
TLDR
Experimental results and extensive analysis show that the attention and gate mechanisms that explicitly model contexts and texts can indeed help the argument impact classification task defined by Durmus et al. (2019), and discourse structures among the context path of the claim to be classified can further boost the performance.
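The gate mechanism mentioned here is, in its common general form, an element-wise interpolation between the context and text representations. A minimal sketch (weight shapes and names are hypothetical, not the authors' exact formulation):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(context, text, W, b):
    """Element-wise gate choosing how much of each representation to keep.

    context, text: (d,) vectors; W: (d, 2d) gate weights; b: (d,) bias.
    """
    g = sigmoid(W @ np.concatenate([context, text]) + b)  # gate in (0, 1)^d
    return g * context + (1.0 - g) * text

# with zero weights the gate is 0.5 everywhere, i.e. a plain average
d = 3
out = gated_fusion(np.ones(d), np.zeros(d), np.zeros((d, 2 * d)), np.zeros(d))
print(out)  # [0.5 0.5 0.5]
```

Because the gate is computed from both inputs, the model can learn per-dimension how much discourse context to inject into each claim representation.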
A Label Dependence-aware Sequence Generation Model for Multi-level Implicit Discourse Relation Recognition
TLDR
This paper treats multi-level IDRR as a conditional label sequence generation task and proposes a Label Dependence-aware Sequence Generation Model (LDSGM) for it, developing a mutual-learning-enhanced training method to exploit label dependence in a bottom-up direction.
CVAE-based Re-anchoring for Implicit Discourse Relation Classification
TLDR
This work uses a conditional VAE (CVAE) to estimate the risk of erroneous sampling, and develops a re-anchoring method that migrates the anchor of the VAE's sampling area to reduce that risk.
Deep Discourse Analysis for Generating Personalized Feedback in Intelligent Tutor Systems
TLDR
This work explores creating automated, personalized feedback in an intelligent tutoring system (ITS) and finds that the personalized feedback generated is highly contextual, domain-aware and effectively targets each student’s misconceptions and knowledge gaps.
A Survey of Implicit Discourse Relation Recognition
TLDR
The main solution approaches for the IDRR task are categorized, including their origins, ideas, strengths and weaknesses, and performance comparisons for those solutions experimented on a public corpus with standard data processing procedures are presented.

References

SHOWING 1-10 OF 34 REFERENCES
Next Sentence Prediction helps Implicit Discourse Relation Classification within and across Domains
TLDR
This work shows that the bidirectional encoder representations from transformers (BERT) proposed by Devlin et al. (2019), which were trained on a next-sentence prediction task and thus encode a representation of likely next sentences, outperform the current state of the art in 11-way classification on the standard PDTB dataset.
Deep Enhanced Representation for Implicit Discourse Relation Recognition
TLDR
A model augmented with text representations at different granularities, including the character, subword, word, sentence, and sentence-pair levels, is proposed; it achieves, for the first time, accuracy greater than 48% in 11-way classification and an F1 score greater than 50% in 4-way classification.
Discourse Relation Prediction: Revisiting Word Pairs with Convolutional Networks
TLDR
A novel approach that represents the input as word pairs achieves state-of-the-art results on four-way classification of both implicit and explicit relations, as well as on one of the binary classification tasks.
Implicit Discourse Relation Classification via Multi-Task Neural Networks
TLDR
This work designs related discourse classification tasks specific to a corpus, and proposes a novel Convolutional Neural Network embedded multi-task learning system to synthesize these tasks by learning both unique and shared representations for each task.
Improving Implicit Discourse Relation Classification by Modeling Inter-dependencies of Discourse Units in a Paragraph
TLDR
Paragraph-level neural networks are introduced that model inter-dependencies between discourse units as well as discourse relation continuity and patterns, and predict a sequence of discourse relations in a paragraph.
Recognizing Implicit Discourse Relations via Repeated Reading: Neural Networks with Multi-Level Attention
TLDR
This work proposes neural networks with multi-level attention (NNMA), combining the attention mechanism and external memories to gradually focus attention on specific words that are helpful for judging the discourse relations.
A Stacking Gated Neural Architecture for Implicit Discourse Relation Classification
TLDR
A stacking neural network model is proposed to solve the classification problem, in which a convolutional neural network is utilized for sentence modeling and a collaborative gated neural network (CGNN) is designed for feature transformation.
Using active learning to expand training data for implicit discourse relation recognition
TLDR
Experimental results show that expanding the training set with small-scale, carefully selected external data yields substantial performance gains, with improvements of about 4% in accuracy and 3.6% in F-score, allowing a weak classifier to achieve performance comparable to state-of-the-art systems.
A Refined End-to-End Discourse Parser
TLDR
This paper describes the discourse parser that participated in the CoNLL-2015 shared task, which uses nine components to identify discourse connectives, label arguments, and classify the sense of Explicit or Non-Explicit relations in free text.
Improving the Inference of Implicit Discourse Relations via Classifying Explicit Discourse Connectives
TLDR
This work investigates the interaction between discourse connectives and discourse relations, and proposes criteria for selecting discourse connectives that can be dropped independently of the context without changing the interpretation of the discourse.