Always Keep your Target in Mind: Studying Semantics and Improving Performance of Neural Lexical Substitution

@article{Arefyev2020AlwaysKY,
  title={Always Keep your Target in Mind: Studying Semantics and Improving Performance of Neural Lexical Substitution},
  author={Nikolay Arefyev and Boris Sheludko and A. V. Podolskiy and Alexander Panchenko},
  journal={ArXiv},
  year={2020},
  volume={abs/2206.11815}
}
Lexical substitution, i.e. generation of plausible words that can replace a particular target word in a given context, is an extremely powerful technology that can be used as a backbone of various NLP applications, including word sense induction and disambiguation, lexical relation extraction, data augmentation, etc. In this paper, we present a large-scale comparative study of lexical substitution methods employing both rather old and most recent language and masked language models (LMs and… 
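
To make the task concrete, the following is a minimal sketch of the masked-LM baseline this line of work builds on: mask the target word and let the model propose in-context replacements. The library, model name, and example sentence are illustrative assumptions, not the authors' exact setup; note how plain masking throws the target word away, which is precisely the problem the paper's title alludes to.

```python
# Minimal sketch of masked-LM lexical substitution (illustrative, not the
# authors' exact method). Assumes the HuggingFace `transformers` library.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

sentence = "The bank raised its interest rates."
target = "bank"

# Plain masking discards the target word entirely, so the model can drift
# to a different sense; keeping the target "in mind" is the paper's focus.
masked = sentence.replace(target, fill.tokenizer.mask_token, 1)
for pred in fill(masked, top_k=5):
    print(pred["token_str"], round(pred["score"], 3))
```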

Citations

ALaSca: an Automated approach for Large-Scale Lexical Substitution

TLDR
ALaSca, a novel approach to automatically creating large-scale datasets for English lexical substitution, can produce examples for potentially any word in a language's vocabulary and cover most of the meanings listed for it, making it possible to unleash the full potential of neural architectures by fine-tuning them on the lexical substitution task.

GeneSis: A Generative Approach to Substitutes in Context

TLDR
GeneSis (Generating Substitutes in context) is the first generative approach to lexical substitution: thanks to a seq2seq model, it generates substitutes for a word according to the context it appears in, attaining state-of-the-art results on different benchmarks.

Improving Contextual Representation with Gloss Regularized Pre-training

TLDR
This work adds an auxiliary gloss regularizer module to BERT pre-training (GR-BERT) that enhances word semantic similarity by predicting masked words while simultaneously aligning contextual embeddings with the corresponding glosses, so that word similarity is modeled explicitly.
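
As a rough illustration, under the assumption that the regularizer simply adds a gloss-alignment term to the usual masked-word loss, a hedged sketch might look as follows; the function name, the cosine objective, and the weighting factor are hypothetical, not GR-BERT's actual code.

```python
# Hedged sketch of gloss-regularised pre-training: combine the standard
# MLM loss with a term that pulls each word's contextual embedding towards
# an embedding of its gloss. Illustrative only, not GR-BERT's implementation.
import torch
import torch.nn.functional as F

def gloss_regularised_loss(mlm_loss, word_embs, gloss_embs, alpha=0.1):
    # alignment term: 1 - mean cosine similarity between words and glosses
    align = 1.0 - F.cosine_similarity(word_embs, gloss_embs, dim=-1).mean()
    return mlm_loss + alpha * align

mlm_loss = torch.tensor(2.3)        # stand-in for a real MLM head's loss
word_embs = torch.randn(8, 768)     # contextual embeddings of 8 words
gloss_embs = torch.randn(8, 768)    # embeddings of their glosses
print(gloss_regularised_loss(mlm_loss, word_embs, gloss_embs))
```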

BOS at LSCDiscovery: Lexical Substitution for Interpretable Lexical Semantic Change Detection

TLDR
This approach achieves the second-best result in the sense loss and sense gain detection subtasks by observing substitutes that are specific to only one time period, which makes it possible to give the user more detailed information about semantic change.

Text Detoxification using Large Pre-trained Neural Models

TLDR
Two novel unsupervised methods for eliminating toxicity in text are presented, one of which uses BERT to replace toxic words with non-offensive synonyms, along with the first large-scale comparative study of style-transfer models on the task of toxicity removal.

Methods for Detoxification of Texts for the Russian Language

TLDR
This work introduces the first study of automatic detoxification of Russian texts to combat offensive language and suggests two types of models for the task: one based on the BERT architecture that performs local corrections, and a supervised approach based on a pretrained GPT-2 language model.

A Comparison of Strategies for Source-Free Domain Adaptation

TLDR
It is found that active learning yields consistent gains across all SemEval 2021 Task 10 tasks and domains, and that although the shared task saw successful self-trained and data-augmented models, a systematic comparison finds these strategies to be unreliable for source-free domain adaptation.

Unsupervised Lexical Substitution with Decontextualised Embeddings

We propose a new unsupervised method for lexical substitution using pre-trained language models. Compared to previous approaches that use the generative capability of language models to predict…

References

SHOWING 1-10 OF 27 REFERENCES

Combining Lexical Substitutes in Neural Word Sense Induction

TLDR
This work improves the WSI approach of Amrami and Goldberg (2018), which clusters lexical substitutes obtained from neural language models for an ambiguous word in a particular context, by proposing methods for combining information from the left and right contexts with similarity to the ambiguous word.
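
The core recipe, a bag of substitutes per occurrence followed by clustering, can be sketched as below; the substitute sets, vectorizer, and clustering algorithm are illustrative stand-ins for the authors' pipeline.

```python
# Hedged sketch of substitute-based word sense induction: represent each
# occurrence of an ambiguous word by its predicted substitutes and cluster
# the occurrences. Assumes scikit-learn; all details are illustrative.
from sklearn.cluster import AgglomerativeClustering
from sklearn.feature_extraction.text import TfidfVectorizer

# substitutes predicted for four occurrences of "bank"
substitute_sets = [
    "lender institution firm company",
    "institution lender branch firm",
    "shore riverside slope edge",
    "shore edge riverside side",
]
vectors = TfidfVectorizer().fit_transform(substitute_sets).toarray()
labels = AgglomerativeClustering(n_clusters=2).fit_predict(vectors)
print(labels)  # occurrences grouped into two induced senses
```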

BERT-based Lexical Substitution

TLDR
This work proposes an end-to-end BERT-based lexical substitution approach that can propose and validate substitute candidates without using any annotated data or manually curated resources, and achieves state-of-the-art results on both the LS07 and LS14 benchmarks.
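
One way to picture the validation step, under the assumption that a good substitute should barely change the sentence's contextualized representation, is the hedged sketch below; the pooling and scoring choices are illustrative, not the paper's exact procedure.

```python
# Hedged sketch of candidate validation: rank a substitute by how little
# the substitution perturbs the contextualised sentence representation.
# Assumes HuggingFace `transformers` and PyTorch; details are illustrative.
import torch
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def sentence_rep(text):
    with torch.no_grad():
        out = model(**tok(text, return_tensors="pt"))
    return out.last_hidden_state.mean(dim=1).squeeze(0)  # mean-pooled tokens

original = sentence_rep("The bank raised its interest rates.")
for cand in ["lender", "firm", "river"]:
    substituted = sentence_rep(f"The {cand} raised its interest rates.")
    score = torch.cosine_similarity(original, substituted, dim=0).item()
    print(cand, round(score, 3))
```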

Supervised All-Words Lexical Substitution using Delexicalized Features

TLDR
A supervised lexical substitution system is proposed that does not use separate classifiers per word and is therefore applicable to any word in the vocabulary; it improves over the state of the art on the LexSub best-precision metric and the Generalized Average Precision measure.

A Simple Word Embedding Model for Lexical Substitution

TLDR
A simple model for lexical substitution is presented, based on the popular skip-gram word embedding model; it is efficient, very simple to implement, and at the same time achieves state-of-the-art results in an unsupervised setting.
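
The model's additive substitutability measure scores a candidate by averaging its embedding similarity to the target and to each context word. A small sketch follows, with toy random vectors standing in for real skip-gram word and context embeddings.

```python
# Hedged sketch of the additive substitutability measure from this line of
# work: score = (cos(s, t) + sum over context words c of cos(s, c)) / (|C| + 1).
# Toy vectors stand in for real skip-gram word/context embeddings.
import numpy as np

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def add_score(sub_vec, target_vec, context_vecs):
    total = cos(sub_vec, target_vec) + sum(cos(sub_vec, c) for c in context_vecs)
    return total / (len(context_vecs) + 1)

rng = np.random.default_rng(0)
target = rng.normal(size=100)                        # target word embedding
context = [rng.normal(size=100) for _ in range(4)]   # context word embeddings
candidate = rng.normal(size=100)                     # substitute candidate
print(add_score(candidate, target, context))
```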

Language Transfer Learning for Supervised Lexical Substitution

TLDR
This work combines state-of-the-art unsupervised features obtained from syntactic word embeddings and distributional thesauri in a supervised delexicalized ranking system and shows that a supervised system can be trained effectively, even if training and evaluation data are from different languages.

Towards better substitution-based word sense induction

TLDR
This work extends the previous method to support a dynamic rather than a fixed number of clusters, as other prominent methods do, and proposes a method for interpreting the resulting clusters by associating them with their most informative substitutes.

A Comparison of Context-sensitive Models for Lexical Substitution

TLDR
It is shown that powerful contextualized word representations, which give high performance in several semantics-related tasks, deal less well with the subtle in-context similarity relationships needed for substitution.

Word Sense Induction with Neural biLM and Symmetric Patterns

TLDR
The combination of the RNN-LM and the dynamic symmetric patterns results in strong substitute vectors for WSI, making it possible to surpass the current state of the art on the SemEval 2013 WSI shared task by a large margin.
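
The dynamic symmetric pattern trick can be sketched with a masked LM standing in for the original biLM: instead of masking the target away, query the model with "<target> and [MASK]" so that predictions stay tied to the target's sense. The model and sentence below are illustrative assumptions.

```python
# Sketch of the dynamic symmetric pattern idea ("T and _"): keep the target
# in the query so the LM's predictions reflect its in-context sense.
# Uses a masked LM as a stand-in for the original biLM; assumes HuggingFace
# `transformers`.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

sentence = "He sat on the bank of the river."
target = "bank"
patterned = sentence.replace(target, f"{target} and {fill.tokenizer.mask_token}", 1)
for pred in fill(patterned, top_k=5):
    print(pred["token_str"], round(pred["score"], 3))
```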

Deep Contextualized Word Representations

TLDR
A new type of deep contextualized word representation is introduced that models both complex characteristics of word use and how these uses vary across linguistic contexts, allowing downstream models to mix different types of semi-supervision signals.

Learning to Rank Lexical Substitutions

TLDR
This paper customizes and evaluates several learning-to-rank models for the lexical substitution task, including classification-based and regression-based approaches, and finds that the best models significantly advance the state of the art.