Deep Semantic Role Labeling: What Works and What's Next

@inproceedings{He2017DeepSR,
  title={Deep Semantic Role Labeling: What Works and What's Next},
  author={Luheng He and Kenton Lee and Mike Lewis and Luke Zettlemoyer},
  booktitle={ACL},
  year={2017}
}
We introduce a new deep learning model for semantic role labeling (SRL) that significantly improves the state of the art, along with detailed analyses to reveal its strengths and limitations. [...] Extensive empirical analysis of these gains shows that (1) deep models excel at recovering long-distance dependencies but can still make surprisingly obvious errors, and (2) there is still room for syntactic parsers to improve these results.
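The model described above treats SRL as BIO tagging over tokens and decodes under hard constraints (e.g., an I- tag may only continue a span of the same role). A minimal sketch of that constrained decoding step, assuming per-token label scores already produced by some neural scorer (the scores and label names here are illustrative, not from the paper):

```python
# Hypothetical sketch of BIO-constrained Viterbi decoding for SRL.
# `scores` holds per-token label scores from a neural model (assumed given);
# decoding forbids invalid transitions such as O -> I-ARG0.

def viterbi_bio(scores, labels):
    """scores: list of {label: float} per token; returns best valid BIO path."""
    def valid(prev, cur):
        # An I- tag must continue a span of the same role.
        if cur.startswith("I-"):
            role = cur[2:]
            return prev in ("B-" + role, "I-" + role)
        return True  # O and B-* may follow anything

    n = len(scores)
    # best[i][lab] = (cumulative score, backpointer label)
    best = [{} for _ in range(n)]
    for lab in labels:
        if not lab.startswith("I-"):  # a sequence cannot start inside a span
            best[0][lab] = (scores[0][lab], None)
    for i in range(1, n):
        for cur in labels:
            cands = [(s + scores[i][cur], prev)
                     for prev, (s, _) in best[i - 1].items()
                     if valid(prev, cur)]
            if cands:
                best[i][cur] = max(cands)
    # Trace back from the highest-scoring final label.
    lab = max(best[-1], key=lambda l: best[-1][l][0])
    path = [lab]
    for i in range(n - 1, 0, -1):
        lab = best[i][lab][1]
        path.append(lab)
    return path[::-1]
```

For instance, if the raw scorer prefers an illegal `I-A0` at the first token, the decoder still returns a well-formed path such as `["B-A0", "I-A0"]`.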
Deep Semantic Role Labeling with Self-Attention
TLDR
This paper presents a simple and effective architecture for SRL based on self-attention, which can directly capture the relationship between two tokens regardless of their distance and is computationally efficient.
Semantic Role Labeling as Dependency Parsing: Exploring Latent Tree Structures Inside Arguments
  • Yu Zhang, Qingrong Xia, +4 authors Min Zhang
  • Computer Science
  • ArXiv
  • 2021
TLDR
This work proposes to reduce SRL to a dependency parsing task, regarding flat argument spans as latent subtrees; it equips the formulation with a novel span-constrained TreeCRF model to make the tree structures span-aware, and extends it to the second-order case.
Syntax-aware Semantic Role Labeling without Parsing
TLDR
The backbone of the model is an LSTM-based semantic role labeler jointly trained with two auxiliary tasks: predicting the dependency label of a word and whether there exists an arc linking it to the predicate.
A Joint Sequential and Relational Model for Frame-Semantic Parsing
TLDR
This work introduces a new method for frame-semantic parsing that significantly outperforms existing neural and non-neural approaches, achieving a 5.7 F1 gain over the current state of the art for full frame structure extraction.
Syntax for Semantic Role Labeling, To Be, Or Not To Be
TLDR
This model achieves state-of-the-art results on the CoNLL-2008 and CoNLL-2009 benchmarks for both English and Chinese, showing the quantitative significance of syntax to neural SRL, together with a thorough empirical survey of existing models.
Syntax-Enhanced Self-Attention-Based Semantic Role Labeling
TLDR
The experimental results demonstrate that, with proper incorporation of high-quality syntactic information, the model achieves a new state-of-the-art performance for the Chinese SRL task on the CoNLL-2009 dataset.
Explicit Contextual Semantics for Text Comprehension
TLDR
This paper makes the first attempt to let SRL enhance text comprehension and inference by specifying verbal predicates and their corresponding semantic roles, and shows that the salient labels can be conveniently added to existing models and significantly improve deep learning models on challenging text comprehension tasks.
Second-Order Semantic Role Labeling With Global Structural Refinement
TLDR
This paper explores a second-order end-to-end SRL model that scores pairs of predicate-argument structures simultaneously, and proposes a structural refinement mechanism to further model higher-order interactions at a global scope.
Semantic Role Labeling For Russian Language Based on Ensemble Model
TLDR
A bidirectional gated recurrent unit and an attention mechanism are used to extract latent features of the candidate arguments; these features, together with the basic features of each argument, are then fed into a neural network to identify its role.
Selectively Connected Self-Attentions for Semantic Role Labeling
Semantic role labeling is an effective approach to understanding the underlying meanings associated with word relationships in natural language sentences. Recent studies using deep neural networks [...]

References

Showing 1-10 of 31 references
A Simple and Accurate Syntax-Agnostic Neural Model for Dependency-based Semantic Role Labeling
TLDR
A simple and accurate neural model for dependency-based semantic role labeling that predicts predicate-argument dependencies from the states of a bidirectional LSTM encoder; it substantially outperforms all previous local models and approaches the best reported results on the English CoNLL-2009 dataset.
Neural Semantic Role Labeling with Dependency Path Embeddings
TLDR
A novel model for semantic role labeling that makes use of neural sequence modeling techniques; it treats complex syntactic structures and related phenomena, such as nested subordinations and nominal predicates, as subsequences of lexicalized dependency paths, and learns suitable embedding representations.
Semantic Role Labeling with Neural Network Factors
TLDR
A new method for semantic role labeling in which arguments and semantic roles are jointly embedded in a shared vector space for a given predicate, based on a neural network designed for the SRL task.
A Global Joint Model for Semantic Role Labeling
We present a model for semantic role labeling that effectively captures the linguistic intuition that a semantic argument frame is a joint structure, with strong dependencies among the arguments. [...]
The Importance of Syntactic Parsing and Inference in Semantic Role Labeling
TLDR
It is shown that full syntactic parsing information is, by far, most relevant in identifying the arguments, especially in the very first stage, the pruning stage; an effective and simple approach of combining different semantic role labeling systems through joint inference is also proposed, which significantly improves performance.
End-to-end learning of semantic role labeling using recurrent neural networks
TLDR
This work proposes a deep bi-directional recurrent network as an end-to-end system for SRL, which takes only the original text as input, without using any syntactic knowledge.
Efficient Inference and Structured Learning for Semantic Role Labeling
TLDR
A dynamic programming algorithm for efficient constrained inference in semantic role labeling that tractably captures a majority of the structural constraints examined by prior work in this area, and allows training a globally-normalized log-linear model with respect to the constrained conditional likelihood.
Chinese Semantic Role Labeling with Bidirectional Recurrent Neural Networks
TLDR
A bidirectional recurrent neural network with long short-term memory (LSTM) units is introduced to capture bidirectional and long-range dependencies in a sentence with minimal feature engineering.
Generalized Inference with Multiple Semantic Role Labeling Systems
We present an approach to semantic role labeling (SRL) that takes the output of multiple argument classifiers and combines them into a coherent predicate-argument output by solving an optimization problem. [...]
Global Neural CCG Parsing with Optimality Guarantees
TLDR
This work introduces the first global recursive neural parsing model with optimality guarantees during decoding, and shows it is possible to learn an efficient A* parser.