Corpus ID: 227228213

Improved Semantic Role Labeling using Parameterized Neighborhood Memory Adaptation

@article{Jindal2020ImprovedSR,
  title={Improved Semantic Role Labeling using Parameterized Neighborhood Memory Adaptation},
  author={Ishan Jindal and Ranit Aharonov and Siddhartha Brahma and Huaiyu Zhu and Yunyao Li},
  journal={ArXiv},
  year={2020},
  volume={abs/2011.14459}
}
Deep neural models achieve some of the best results for semantic role labeling. Inspired by instance-based learning that utilizes nearest neighbors to handle low-frequency context-specific training samples, we investigate the use of memory adaptation techniques in deep neural models. We propose a parameterized neighborhood memory adaptive (PNMA) method that uses a parameterized representation of the nearest neighbors of tokens in a memory of activations and makes predictions based on the most… 
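To make the memory-adaptation idea from the abstract concrete, here is a minimal sketch in Python, assuming a memory of (activation, gold label) pairs and a fixed interpolation weight. The names (`NeighborhoodMemory`, `adapted_prediction`, `lam`) are illustrative, not from the paper; PNMA itself learns a parameterized representation of the neighbors rather than using this fixed kNN interpolation.

```python
import numpy as np

# Sketch of a neighborhood memory: store token activations with their gold
# labels, then blend a base model's label distribution with a distribution
# derived from the most similar stored activations.

class NeighborhoodMemory:
    def __init__(self, dim, num_labels, k=8):
        self.keys = np.empty((0, dim))       # stored token activations
        self.labels = np.empty((0,), int)    # gold label ids for each key
        self.num_labels = num_labels
        self.k = k

    def add(self, activations, labels):
        self.keys = np.vstack([self.keys, activations])
        self.labels = np.concatenate([self.labels, labels])

    def neighbor_distribution(self, query):
        # Cosine similarity between the query activation and all stored keys.
        sims = self.keys @ query / (
            np.linalg.norm(self.keys, axis=1) * np.linalg.norm(query) + 1e-8)
        top = np.argsort(-sims)[: self.k]
        # Softmax-weighted vote over the k nearest neighbors' labels.
        weights = np.exp(sims[top] - sims[top].max())
        weights /= weights.sum()
        dist = np.zeros(self.num_labels)
        for w, lbl in zip(weights, self.labels[top]):
            dist[lbl] += w
        return dist

def adapted_prediction(model_probs, memory, query, lam=0.5):
    # Interpolate the base model's distribution with the memory-derived one
    # (lam is a hypothetical mixing weight, not a value from the paper).
    return (1 - lam) * model_probs + lam * memory.neighbor_distribution(query)
```

The intended benefit is on low-frequency, context-specific tokens: even when the base model is unsure, similar training instances retrieved from the memory can pull the prediction toward the right label.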

Citations

Semantic Role Labeling as Dependency Parsing: Exploring Latent Tree Structures Inside Arguments

TLDR
This paper equips the formulation with a novel span-constrained TreeCRF to make tree structures span-aware, further extends it to the second-order case, and achieves a new state of the art under both end-to-end and gold-predicate settings.

Fast and Accurate End-to-End Span-based Semantic Role Labeling as Word-based Graph Parsing

TLDR
This paper casts end-to-end span-based SRL as a word-based graph parsing task, proposes and compares four different graph-representation schemata (BES, BE, BIES, and BII), and finds that the BES schema performs best.

Label Definitions Improve Semantic Role Labeling

TLDR
By injecting label definitions given the predicate senses, this model achieves state-of-the-art performance on the CoNLL09 dataset; the improvement is even more pronounced in low-resource settings where training data is scarce.

Universal Proposition Bank 2.0

TLDR
This paper introduces Universal Proposition Bank 2.0 (UP2.0), with significant enhancements over UP1.0: higher-quality propbanks built with a state-of-the-art monolingual SRL model and improved auto-generation of annotations; expanded language coverage (from 7 to 23 languages); and span annotations decoupled from syntactic analysis.

An MRC Framework for Semantic Role Labeling

TLDR
This paper formalizes predicate disambiguation as multiple-choice machine reading comprehension, where the descriptions of candidate senses of a given predicate are used as options to select the correct sense.

References

Showing 1-10 of 33 references

Neural Semantic Role Labeling with Dependency Path Embeddings

TLDR
A novel model for semantic role labeling that uses neural sequence modeling, treats complex syntactic structures and related phenomena (such as nested subordinations and nominal predicates) as subsequences of lexicalized dependency paths, and learns suitable embedding representations for them.

Semantic Role Labeling with Associated Memory Network

TLDR
A novel syntax-agnostic SRL model enhanced by an associated memory network (AMN), which uses inter-sentence attention over associated sentences with known labels as a form of memory to further improve dependency-based SRL.

Deep Semantic Role Labeling with Self-Attention

TLDR
This paper presents a simple and effective self-attention-based architecture for SRL that can directly capture the relationship between two tokens regardless of their distance and is computationally efficient.
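To make the distance-independence claim concrete, here is the standard scaled dot-product self-attention computation; this is the generic mechanism, not the paper's exact architecture, which adds multi-head projections and deep stacking.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a token sequence.

    X: (seq_len, d_model) token representations.
    Every token attends to every other token in a single step, so the
    interaction between two tokens does not depend on their distance.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])          # (seq_len, seq_len)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V
```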

Encoding Sentences with Graph Convolutional Networks for Semantic Role Labeling

TLDR
Proposes a version of graph convolutional networks (GCNs), a recent class of neural networks operating on graphs, suited to modeling syntactic dependency graphs, and observes that GCN layers are complementary to LSTM ones.
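A one-layer sketch of the idea, assuming an adjacency matrix built from dependency arcs; this is generic GCN message passing, not the paper's exact variant, which uses directed, labeled edges and gating.

```python
import numpy as np

def gcn_layer(H, A, W):
    """One graph-convolution step over a syntactic dependency graph.

    H: (n_tokens, d) node features; A: (n_tokens, n_tokens) adjacency
    matrix built from dependency arcs; W: (d, d_out) layer weights.
    Each token aggregates features from its syntactic neighbors, which is
    why GCN layers complement sequential LSTM layers.
    """
    A_hat = A + np.eye(A.shape[0])               # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)       # simple degree normalization
    return np.maximum(0, (A_hat / deg) @ H @ W)  # ReLU activation
```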

Unbounded cache model for online language modeling with open vocabulary

TLDR
This paper uses a large-scale non-parametric memory component that stores all the hidden activations seen in the past, and leverages recent advances in approximate nearest-neighbor search and quantization algorithms to store millions of representations while searching them efficiently.
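A minimal sketch of that retrieval step, assuming the faiss similarity-search library is available; for clarity it uses an exact L2 index, whereas the paper relies on approximate, quantized indexes to scale to millions of stored activations.

```python
import numpy as np
import faiss  # assumed dependency: similarity-search library

d = 512  # hidden-activation dimensionality (illustrative)
# Stand-in for hidden activations cached from previously seen tokens.
memory = np.random.randn(100_000, d).astype('float32')

# Exact L2 index for clarity; an approximate, quantized index (e.g. IVF+PQ)
# would be used to scale the cache further.
index = faiss.IndexFlatL2(d)
index.add(memory)

query = np.random.randn(1, d).astype('float32')
distances, ids = index.search(query, 10)  # the 10 most similar cached activations
```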

Deep Semantic Role Labeling: What Works and What's Next

We introduce a new deep learning model for semantic role labeling (SRL) that significantly improves the state of the art, along with detailed analyses to reveal its strengths and limitations. We use…

End-to-end learning of semantic role labeling using recurrent neural networks

TLDR
This work proposes a deep bi-directional recurrent network as an end-to-end system for SRL that takes only the original text as input, without using any syntactic knowledge.

A Simple and Accurate Syntax-Agnostic Neural Model for Dependency-based Semantic Role Labeling

TLDR
A simple and accurate neural model for dependency-based semantic role labeling that predicts predicate-argument dependencies from the states of a bidirectional LSTM encoder; it substantially outperforms all previous local models and approaches the best reported results on the English CoNLL-2009 dataset.

Jointly Predicting Predicates and Arguments in Neural Semantic Role Labeling

TLDR
This work proposes an end-to-end approach for jointly predicting all predicates, argument spans, and the relations between them, making independent decisions about what relationship, if any, holds between every possible word-span pair.

Linguistically-Informed Self-Attention for Semantic Role Labeling

TLDR
LISA is a neural network model that combines multi-head self-attention with multi-task learning across dependency parsing, part-of-speech tagging, predicate detection and SRL, and can incorporate syntax using merely raw tokens as input.