Deep Semantic Role Labeling: What Works and What's Next

Luheng He, Kenton Lee, Mike Lewis, Luke Zettlemoyer
We introduce a new deep learning model for semantic role labeling (SRL) that significantly improves the state of the art, along with detailed analyses revealing its strengths and limitations. Extensive empirical analysis of these gains shows that (1) deep models excel at recovering long-distance dependencies but can still make surprisingly obvious errors, and (2) there is still room for syntactic parsers to improve these results.

Deep Semantic Role Labeling with Self-Attention

This paper presents a simple and effective self-attention-based architecture for SRL that can directly capture the relationship between any two tokens regardless of their distance and is computationally efficient.
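The distance-independence claimed above comes from the attention operation itself: every token attends to every other token in a single step. A minimal sketch of scaled dot-product self-attention (single head, no learned projections; not the paper's exact architecture) illustrates this:

```python
import numpy as np

def self_attention(X):
    """Scaled dot-product self-attention over token embeddings X of shape (n, d).

    Every output position attends to every input position in one step, so the
    path between any two tokens has length 1 regardless of their distance.
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)                        # (n, n) pairwise scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # row-wise softmax
    return weights @ X                                   # weighted mix of all tokens

# toy sentence: 5 tokens with 8-dimensional embeddings
X = np.random.default_rng(0).standard_normal((5, 8))
out = self_attention(X)
print(out.shape)  # (5, 8)
```

The full model would add query/key/value projections, multiple heads, and position information; this sketch only shows why the token-to-token path length is constant.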

Semantic Role Labeling with Iterative Structure Refinement

This work encodes prior knowledge about the SRL problem by designing a restricted network architecture that captures non-local interactions, resulting in an effective model that outperforms strong factorized baselines on all 7 CoNLL-2009 languages and achieves state-of-the-art results on 5 of them, including English.

Syntax-aware Semantic Role Labeling without Parsing

The backbone of the model is an LSTM-based semantic role labeler jointly trained with two auxiliary tasks: predicting the dependency label of a word and whether there exists an arc linking it to the predicate.
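Joint training of this kind typically combines the primary SRL objective with the auxiliary objectives in a single weighted loss. A hypothetical sketch (the weighting and shapes are illustrative assumptions, not the paper's configuration):

```python
import numpy as np

def joint_loss(srl_logits, srl_gold, dep_logits, dep_gold,
               arc_logits, arc_gold, aux_weight=0.1):
    """Hypothetical multi-task objective: primary SRL cross-entropy plus two
    auxiliary losses (dependency label of a word, existence of an arc to the
    predicate), as in jointly trained syntax-aware SRL. Weights are illustrative.
    """
    def xent(logits, gold):
        # per-token softmax cross-entropy, averaged over tokens
        z = logits - logits.max(axis=-1, keepdims=True)
        logp = z - np.log(np.exp(z).sum(axis=-1, keepdims=True))
        return -logp[np.arange(len(gold)), gold].mean()
    return (xent(srl_logits, srl_gold)
            + aux_weight * xent(dep_logits, dep_gold)
            + aux_weight * xent(arc_logits, arc_gold))

# toy batch: 4 tokens, 3 SRL roles, 5 dependency labels, binary arc decision
rng = np.random.default_rng(0)
loss = joint_loss(rng.standard_normal((4, 3)), np.array([0, 1, 2, 0]),
                  rng.standard_normal((4, 5)), np.array([1, 0, 4, 2]),
                  rng.standard_normal((4, 2)), np.array([0, 1, 1, 0]))
print(float(loss))
```

The auxiliary heads are dropped at test time; only the shared encoder benefits from the syntactic signal, which is how the model stays "without parsing" at inference.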

A Joint Sequential and Relational Model for Frame-Semantic Parsing

This work introduces a new method for frame-semantic parsing that significantly outperforms existing neural and non-neural approaches, achieving a 5.7 F1 gain over the current state of the art, for full frame structure extraction.

Syntax for Semantic Role Labeling, To Be, Or Not To Be

This model achieves state-of-the-art results on the CoNLL-2008 and CoNLL-2009 benchmarks for both English and Chinese, demonstrating the quantitative significance of syntax to neural SRL, together with a thorough empirical survey of existing models.

Syntax-Enhanced Self-Attention-Based Semantic Role Labeling

The experimental results demonstrate that with proper incorporation of high-quality syntactic information, the model achieves new state-of-the-art performance on the Chinese SRL task on the CoNLL-2009 dataset.

Predicate Informed Syntax-Guidance for Semantic Role Labeling

This work offers a simple BERT-based model which shows that, contrary to the popular belief that more complexity means better performance, removing the LSTM improves the state of the art for span-based semantic role labeling.

Explicit Contextual Semantics for Text Comprehension

This paper makes the first attempt to let SRL enhance text comprehension and inference by specifying verbal predicates and their corresponding semantic roles, and shows that these salient labels can be conveniently added to existing models, significantly improving deep learning models on challenging text comprehension tasks.

Second-Order Semantic Role Labeling With Global Structural Refinement

This paper explores a second-order end-to-end SRL model that simultaneously considers two predicate-argument pairs when scoring, and proposes a structural refinement mechanism to further model higher-order interactions at a global scope.

Semantic Role Labeling For Russian Language Based on Ensemble Model

A Bidirectional Gated Recurrent Unit and an attention mechanism are used to extract latent features of the candidate arguments, which are then fed, together with each argument's basic features, into a neural network that identifies its role.

A Simple and Accurate Syntax-Agnostic Neural Model for Dependency-based Semantic Role Labeling

A simple and accurate neural model for dependency-based semantic role labeling that predicts predicate-argument dependencies from the states of a bidirectional LSTM encoder; it substantially outperforms all previous local models and approaches the best reported results on the English CoNLL-2009 dataset.
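A local model of this kind scores each (predicate, argument) pair from the encoder states of the two words alone, with no global structure. A hypothetical sketch in that spirit (the shapes and the concatenation-based scorer are illustrative assumptions, not the paper's exact parameterization):

```python
import numpy as np

def score_roles(states, pred_idx, W):
    """Score every word as an argument of one predicate, locally.

    states:   (n, d) BiLSTM output states for an n-word sentence.
    pred_idx: index of the predicate word.
    W:        (num_roles, 2d) role classifier weights.
    Returns an (n, num_roles) matrix of role scores per candidate argument.
    """
    pred = states[pred_idx]                                      # predicate encoding
    pairs = np.concatenate(
        [states, np.broadcast_to(pred, states.shape)], axis=-1)  # (n, 2d) word-pair reps
    return pairs @ W.T                                           # purely local scores

rng = np.random.default_rng(0)
states = rng.standard_normal((6, 4))   # toy sentence: 6 words, d=4
W = rng.standard_normal((3, 8))        # 3 roles, e.g. A0 / A1 / none
scores = score_roles(states, pred_idx=2, W=W)
print(scores.shape)  # (6, 3)
```

Because each score depends only on one word pair, decoding is a per-word argmax; the surprise in this line of work is how competitive such purely local scoring is.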

Neural Semantic Role Labeling with Dependency Path Embeddings

A novel model for semantic role labeling that makes use of neural sequence modeling techniques and treats complex syntactic structures and related phenomena, such as nested subordinations and nominal predicates, as subsequences of lexicalized dependency paths and learns suitable embedding representations.

Semantic Role Labeling with Neural Network Factors

A new method for semantic role labeling in which arguments and semantic roles are jointly embedded in a shared vector space for a given predicate, which is based on a neural network designed for the SRL task.

A Global Joint Model for Semantic Role Labeling

We present a model for semantic role labeling that effectively captures the linguistic intuition that a semantic argument frame is a joint structure, with strong dependencies among the arguments.

The Importance of Syntactic Parsing and Inference in Semantic Role Labeling

It is shown that full syntactic parsing information is, by far, most relevant to identifying the arguments, especially in the very first stage (the pruning stage), and an effective and simple approach for combining different semantic role labeling systems through joint inference is proposed, which significantly improves performance.

End-to-end learning of semantic role labeling using recurrent neural networks

This work proposes a deep bidirectional recurrent network as an end-to-end system for SRL, which takes only the original text as input, without using any syntactic knowledge.

Efficient Inference and Structured Learning for Semantic Role Labeling

A dynamic programming algorithm for efficient constrained inference in semantic role labeling that tractably captures a majority of the structural constraints examined by prior work in this area, and allows training a globally-normalized log-linear model with respect to constrained conditional likelihood.
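The core of such dynamic-programming inference is a Viterbi-style pass in which illegal transitions are given score negative infinity. A minimal sketch for the standard BIO constraint (I-X may only follow B-X or I-X); the emission scores and this single constraint are illustrative, not the paper's full constraint set:

```python
import numpy as np

def viterbi_bio(emissions, labels):
    """BIO-constrained Viterbi decoding over an (n, k) emission-score matrix."""
    n, k = emissions.shape

    def allowed(prev, cur):
        # constraint: I-X must be preceded by B-X or I-X
        if labels[cur].startswith("I-"):
            role = labels[cur][2:]
            return labels[prev] in ("B-" + role, "I-" + role)
        return True

    score = emissions[0].copy()
    for j in range(k):                      # forbid starting inside a span
        if labels[j].startswith("I-"):
            score[j] = -np.inf
    back = np.zeros((n, k), dtype=int)
    for t in range(1, n):
        new = np.full(k, -np.inf)
        for j in range(k):
            cand = [score[i] if allowed(i, j) else -np.inf for i in range(k)]
            best = int(np.argmax(cand))
            back[t, j] = best
            new[j] = cand[best] + emissions[t, j]
        score = new
    path = [int(np.argmax(score))]          # backtrack the best legal sequence
    for t in range(n - 1, 0, -1):
        path.append(back[t, path[-1]])
    return [labels[j] for j in reversed(path)]

# greedy per-token argmax would start with the illegal "I-A0";
# the constrained decoder recovers a valid span instead
emissions = np.array([[0., 0., 5.],
                      [0., 0., 5.],
                      [5., 0., 0.]])
path = viterbi_bio(emissions, ["O", "B-A0", "I-A0"])
print(path)  # → ['B-A0', 'I-A0', 'O']
```

The structured-learning part of the paper then trains the scores so that this constrained decoder, not the unconstrained one, assigns the gold sequence the highest likelihood.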

Chinese Semantic Role Labeling with Bidirectional Recurrent Neural Networks

A bidirectional recurrent neural network with long short-term memory (LSTM) units is introduced to capture bidirectional and long-range dependencies in a sentence with minimal feature engineering.

Generalized Inference with Multiple Semantic Role Labeling Systems

We present an approach to semantic role labeling (SRL) that takes the output of multiple argument classifiers and combines them into a coherent predicate-argument output by solving an optimization problem.

Global Neural CCG Parsing with Optimality Guarantees

This work introduces the first global recursive neural parsing model with optimality guarantees during decoding, and shows it is possible to learn an efficient A* parser.