Multi-sense embeddings through a word sense disambiguation process

@article{Ruas2019MultisenseET,
  title={Multi-sense embeddings through a word sense disambiguation process},
  author={Terry Ruas and William I. Grosky and Akiko Aizawa},
  journal={ArXiv},
  year={2019},
  volume={abs/2101.08700}
}

Citations

TensSent: a tensor based sentimental word embedding method
TLDR
This study proposes two novel unsupervised models that integrate word polarity information and word co-occurrences as more tailored features for sentiment analysis.
Graph Convolutional Network for Word Sense Disambiguation
TLDR
The softmax function is applied to determine the semantic category of the ambiguous word, and experimental results show that the average accuracy of the proposed WSD method is improved.
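As a minimal illustration of that last step, assuming the network emits one score per candidate sense, a softmax converts the scores into a distribution and the argmax selects the sense; the sense names and scores below are made up:

```python
import numpy as np

def softmax(scores: np.ndarray) -> np.ndarray:
    """Numerically stable softmax over a vector of sense scores."""
    shifted = scores - scores.max()
    exp = np.exp(shifted)
    return exp / exp.sum()

# Hypothetical GCN output: one score per candidate sense of "bank".
senses = ["bank.n.01 (financial)", "bank.n.02 (river)"]
scores = np.array([2.3, 0.7])

probs = softmax(scores)
predicted = senses[int(np.argmax(probs))]
print(dict(zip(senses, probs.round(3))), "->", predicted)
```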
Measuring associational thinking through word embeddings
TLDR
The research concludes that the weighted average of the cosine-similarity coefficients derived from independent word embeddings in a double vector space tends to yield high correlations with human judgements.
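A minimal sketch of the scoring described here, assuming two independently trained embedding spaces (the "double vector space") and an interpolation weight; all names and vectors below are illustrative, not the paper's setup:

```python
import numpy as np

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def weighted_similarity(w1: str, w2: str, space_a: dict, space_b: dict,
                        weight: float = 0.5) -> float:
    """Weighted average of cosine similarities from two independent spaces."""
    sim_a = cosine(space_a[w1], space_a[w2])
    sim_b = cosine(space_b[w1], space_b[w2])
    return weight * sim_a + (1.0 - weight) * sim_b

# Toy 3-d vectors standing in for two independently trained embedding spaces.
space_a = {"car": np.array([1.0, 0.2, 0.0]), "road": np.array([0.9, 0.3, 0.1])}
space_b = {"car": np.array([0.4, 1.0, 0.3]), "road": np.array([0.5, 0.8, 0.2])}
print(weighted_similarity("car", "road", space_a, space_b, weight=0.6))
```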
Pairwise Multi-Class Document Classification for Semantic Relations between Wikipedia Articles
TLDR
This paper models the problem of finding the relationship between two documents as a pairwise document classification task; the results suggest that classifying semantic relations between documents is a solvable task and motivate the development of a recommender system based on the evaluated techniques.
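One way to picture the pairwise setup: represent each article as a vector, concatenate the two vectors of a pair, and train a standard multi-class classifier over relation labels. A toy sketch with made-up vectors, pairs, and labels, not the paper's actual model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical document vectors (e.g., averaged embeddings of each article).
rng = np.random.default_rng(0)
doc_vecs = rng.normal(size=(6, 8))

# Pairwise examples: each input is the concatenation of two document vectors,
# each label an invented semantic-relation class between the two documents.
pairs = [(0, 1), (2, 3), (4, 5), (1, 2), (3, 4), (0, 5)]
X = np.array([np.concatenate([doc_vecs[i], doc_vecs[j]]) for i, j in pairs])
y = np.array([0, 1, 2, 0, 1, 2])  # e.g., 0=related, 1=same-topic, 2=none

clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.predict(X[:2]))
```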
Age of Exposure 2.0: Estimating word complexity using iterative models of word embeddings.
Age of acquisition (AoA) is a measure of word complexity which refers to the age at which a word is typically learned. AoA measures have shown strong correlations with reading comprehension, lexical
Evolution of Semantic Similarity—A Survey
TLDR
This survey article traces the evolution of semantic similarity methods, from traditional NLP techniques such as kernel-based methods to the most recent work on transformer-based models, categorizing them by their underlying principles as knowledge-based, corpus-based, deep neural network-based, and hybrid methods.
Vec2Dynamics: A Temporal Word Embedding Approach to Exploring the Dynamics of Scientific Keywords - Machine Learning as a Case Study
TLDR
This paper proposes Vec2Dynamics, a neural-based computational-history approach that reports the stability of the k-nearest neighbors of scientific keywords over time; the stability indicates whether a keyword is acquiring a new neighborhood as the scientific literature evolves.
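The stability signal described here can be approximated as neighborhood overlap between time-sliced embedding spaces; a hedged sketch with hypothetical helper names and toy vectors, not the paper's implementation:

```python
import numpy as np

def nearest_neighbors(word, space, k):
    """k most cosine-similar words to `word` in a {word: vector} space."""
    target = space[word]
    sims = {w: float(np.dot(target, v) /
                     (np.linalg.norm(target) * np.linalg.norm(v)))
            for w, v in space.items() if w != word}
    return sorted(sims, key=sims.get, reverse=True)[:k]

def knn_stability(word, space_t1, space_t2, k=2):
    """Fraction of the word's k nearest neighbors shared across two
    time-sliced spaces (1.0 = a fully stable neighborhood)."""
    shared = (set(nearest_neighbors(word, space_t1, k)) &
              set(nearest_neighbors(word, space_t2, k)))
    return len(shared) / k

# Toy 2-d spaces for two time slices; vectors are illustrative only.
t1 = {"learning": np.array([1.0, 0.0]), "training": np.array([0.9, 0.1]),
      "network": np.array([0.8, 0.3]), "kernel": np.array([0.2, 1.0])}
t2 = {"learning": np.array([1.0, 0.1]), "training": np.array([0.9, 0.0]),
      "network": np.array([0.3, 0.9]), "kernel": np.array([0.2, 1.0])}
print(knn_stability("learning", t1, t2, k=2))
```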
...

References

Showing 1-10 of 90 references
Embeddings for Word Sense Disambiguation: An Evaluation Study
TLDR
This work proposes different methods through which word embeddings can be leveraged in a state-of-the-art supervised WSD system architecture, and performs a deep analysis of how different parameters affect performance.
Do Multi-Sense Embeddings Improve Natural Language Understanding?
TLDR
A multi-sense embedding model based on Chinese Restaurant Processes is introduced that achieves state-of-the-art performance on matching human word similarity judgments, and a pipelined architecture for incorporating multi-sense embeddings into language understanding is proposed.
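The Chinese Restaurant Process behind such a model can be caricatured as follows: each occurrence of a word joins an existing sense with probability proportional to that sense's popularity times its fit to the context, or opens a new sense with probability proportional to a concentration parameter gamma. A toy sketch under those assumptions, not the paper's implementation:

```python
import numpy as np

def crp_assign_sense(context_vec, sense_vecs, sense_counts, gamma=1.0):
    """CRP-style choice: an existing sense attracts the occurrence in
    proportion to its count times its fit to the context; a new sense
    is opened with weight gamma."""
    fits = [count * max(1e-6, float(np.dot(context_vec, mu)))
            for mu, count in zip(sense_vecs, sense_counts)]
    fits.append(gamma)  # the "new table": open a brand-new sense
    probs = np.array(fits) / np.sum(fits)
    choice = int(np.random.choice(len(probs), p=probs))
    return choice  # index == len(sense_vecs) means "create a new sense"

# Toy call: two existing senses of a word, the second a better contextual fit.
ctx = np.array([0.1, 0.9])
senses = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
print(crp_assign_sense(ctx, senses, sense_counts=[10, 5], gamma=1.0))
```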
Embedding Words and Senses Together via Joint Knowledge-Enhanced Training
TLDR
This work proposes a new model which learns word and sense embeddings jointly and exploits large corpora and knowledge from semantic networks in order to produce a unified vector space of words and senses.
SensEmbed: Learning Sense Embeddings for Word and Relational Similarity
TLDR
This work proposes a multifaceted approach that transforms word embeddings to the sense level and leverages knowledge from a large semantic network for effective semantic similarity measurement.
Improving Distributed Representation of Word Sense via WordNet Gloss Composition and Context Clustering
TLDR
The learned representations outperform the publicly available embeddings on 2 out of 4 metrics in the word similarity task and on 6 out of 13 subtasks in the analogical reasoning task.
Simple Embedding-Based Word Sense Disambiguation
TLDR
A knowledge-based WSD method that uses word and sense embeddings to compute the similarity between the gloss of a sense and the context of the word; the results show that lexically extending the number of words in the gloss and context, although it works well for other implementations of Lesk, harms this method.
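A minimal sketch of this gloss-context scheme, assuming averaged pretrained word vectors stand in for gloss and context representations; the embeddings and glosses below are toy data, not the paper's resources:

```python
import numpy as np

def avg_vector(words, embeddings):
    """Average the vectors of the words we have embeddings for."""
    vecs = [embeddings[w] for w in words if w in embeddings]
    return np.mean(vecs, axis=0) if vecs else np.zeros(3)

def cosine(u, v):
    denom = np.linalg.norm(u) * np.linalg.norm(v)
    return float(np.dot(u, v) / denom) if denom else 0.0

def disambiguate(context_words, sense_glosses, embeddings):
    """Pick the sense whose gloss vector is closest to the context vector."""
    ctx = avg_vector(context_words, embeddings)
    scores = {s: cosine(ctx, avg_vector(g.split(), embeddings))
              for s, g in sense_glosses.items()}
    return max(scores, key=scores.get), scores

# Toy 3-d embeddings; a real system would use pretrained word/sense vectors.
emb = {"money": np.array([1.0, 0.1, 0.0]), "deposit": np.array([0.9, 0.2, 0.1]),
       "river": np.array([0.0, 1.0, 0.2]), "water": np.array([0.1, 0.9, 0.3]),
       "institution": np.array([0.8, 0.0, 0.2]), "land": np.array([0.1, 0.8, 0.1])}
glosses = {"bank.n.01": "institution deposit money",
           "bank.n.02": "land river water"}
best, _ = disambiguate(["deposit", "money"], glosses, emb)
print(best)  # bank.n.01
```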
Entity Linking meets Word Sense Disambiguation: a Unified Approach
TLDR
Babelfy is presented, a unified graph-based approach to EL and WSD based on a loose identification of candidate meanings coupled with a densest subgraph heuristic which selects high-coherence semantic interpretations.
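Babelfy's actual heuristic operates on a semantic graph of candidate meanings; as a rough stand-in for the densest-subgraph step, here is the classic greedy peeling routine (repeatedly drop the minimum-degree node and keep the densest intermediate graph) on a plain adjacency dict, a sketch rather than the paper's algorithm:

```python
def densest_subgraph(adj):
    """Greedy 2-approximation (Charikar): repeatedly drop the minimum-degree
    node, returning the intermediate node set with the highest edge density."""
    adj = {u: set(vs) for u, vs in adj.items()}
    nodes = set(adj)
    edges = sum(len(vs) for vs in adj.values()) // 2
    best_nodes, best_density = set(nodes), 0.0
    while nodes:
        density = edges / len(nodes)
        if density > best_density:
            best_density, best_nodes = density, set(nodes)
        u = min(nodes, key=lambda n: len(adj[n]))  # peel the weakest node
        for v in adj[u]:
            adj[v].discard(u)
        edges -= len(adj[u])
        nodes.discard(u)
        adj[u] = set()
    return best_nodes, best_density

# Toy adjacency over candidate meanings: a dense triangle plus a pendant node.
graph = {"a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b", "d"}, "d": {"c"}}
print(densest_subgraph(graph))
```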
De-Conflated Semantic Representations
TLDR
This work proposes a technique that tackles semantic representation problems by de-conflating the representations of words based on the deep knowledge it derives from a semantic network; its advantages include high coverage and the ability to generate accurate representations even for infrequent word senses.
A Unified Model for Word Sense Representation and Disambiguation
TLDR
A unified model for joint word sense representation and disambiguation, which assigns distinct representations to each word sense, improves the performance of contextual word similarity compared to existing WSR methods, outperforms state-of-the-art supervised methods on domain-specific WSD, and achieves competitive performance on coarse-grained all-words WSD.
...