Corpus ID: 1685679

DISSECT - DIStributional SEmantics Composition Toolkit

@inproceedings{Dinu2013DISSECTD,
  title={DISSECT - DIStributional SEmantics Composition Toolkit},
  author={Georgiana Dinu and Nghia The Pham and Marco Baroni},
  booktitle={ACL},
  year={2013}
}
We introduce DISSECT, a toolkit to build and explore computational models of word, phrase and sentence meaning based on the principles of distributional semantics. The toolkit focuses in particular on compositional meaning, and implements a number of composition methods that have been proposed in the literature. Furthermore, DISSECT can be useful to researchers and practitioners who need models of word meaning (without composition) as well, as it supports various methods to construct…
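Among the composition methods from the literature that the abstract refers to are the weighted additive and component-wise multiplicative models. As a minimal illustration only (plain NumPy with made-up toy vectors, not DISSECT's own API), these two methods can be sketched as:

```python
import numpy as np

# Toy distributional vectors (invented values; a real model would be
# built from corpus co-occurrence statistics).
red = np.array([2.0, 0.0, 1.0])
car = np.array([1.0, 3.0, 2.0])

def weighted_additive(u, v, alpha=0.5, beta=0.5):
    """Weighted additive composition: p = alpha*u + beta*v."""
    return alpha * u + beta * v

def multiplicative(u, v):
    """Component-wise multiplicative composition: p_i = u_i * v_i."""
    return u * v

def cosine(u, v):
    """Cosine similarity, the standard comparison measure in
    distributional semantic models."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Compose a phrase vector for "red car" under each model.
red_car_add = weighted_additive(red, car)   # [1.5, 1.5, 1.5]
red_car_mul = multiplicative(red, car)      # [2.0, 0.0, 2.0]

# The composed vector can then be compared to other vectors,
# e.g. to the head noun alone.
print(cosine(red_car_add, car))
```

Both models produce a phrase vector in the same space as the word vectors, so the usual similarity machinery applies unchanged; this is the property the toolkit's exploration functions rely on.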

Citations

Dissecting the Practical Lexical Function Model for Compositional Distributional Semantics
An inconsistency in PLF between the objective function at training and the prediction at testing which leads to an overcounting of the predicate’s contribution to the meaning of the phrase is identified.
Lexical Substitution for Evaluating Compositional Distributional Models
This paper argues for lexical substitution (LexSub) as a means to evaluate CDSMs, and creates a LexSub dataset for CDSM evaluation from a corpus with manual “all-words” LexSub annotation.
Predictability of Distributional Semantics in Derivational Word Formation
This paper uses linear regression models to analyze CDSM performance and obtain insights into the linguistic factors that influence how predictable the distributional context of a derived word is going to be, notably part of speech, argument structure, and semantic regularity.
A Generalisation of Lexical Functions for Composition in Distributional Semantics
This work carries out a large-scale evaluation, comparing different composition methods within the distributional framework for the cases of both adjective-noun and noun-noun composition, and proposes a novel method of composition, which generalises the approach by Baroni and Zamparelli (2010).
Fish Transporters and Miracle Homes: How Compositional Distributional Semantics can Help NP Parsing
In this work, we argue that measures that have been shown to quantify the degree of semantic plausibility of phrases, as obtained from their compositionally-derived distributional semantic…
Indra: A Word Embedding and Semantic Relatedness Server
INDRA is described, a multi-lingual word embedding/distributional semantics framework which supports the creation, use and evaluation of word embedding models and shares more than 65 pre-computed models in 14 languages.
Compositional Distributional Semantics Models in Chunk-based Smoothed Tree Kernels
The chunk-based smoothed tree kernels (CSTKs) are proposed as a way to exploit the syntactic structures as well as the reliability of these compositional models for simple phrases.
Predicting the Compositionality of Nominal Compounds: Giving Word Embeddings a Hard Time
A large-scale multilingual evaluation of DSMs for predicting the degree of semantic compositionality of nominal compounds on 4 datasets for English and French shows a high correlation with human judgments, being comparable to or outperforming the state of the art for some datasets.
Towards Syntax-aware Compositional Distributional Semantic Models
It is shown that under a suitable regime these two approaches can be regarded as the same and, thus, structural information and distributional semantics can successfully cooperate in CDSMs for NLP tasks.
Reverse-engineering Language: A Study on the Semantic Compositionality of German Compounds
This paper analyzes the performance of different composition models on a large dataset of German compound nouns and presents a new, simple model that achieves the best performance on the authors' dataset.

References

Showing 1-10 of 19 references
Estimating Linear Models for Compositional Distributional Semantics
This paper proposes a novel approach to estimate parameters for a class of compositional distributional models: the additive models, and demonstrates that this approach outperforms existing methods for determining a good model for compositional distributional semantics.
Compositional-ly Derived Representations of Morphologically Complex Words in Distributional Semantics
This work adapts compositional methods originally developed for phrases to the task of deriving the distributional meaning of morphologically complex words from their parts, and demonstrates the usefulness of a compositional morphology component in distributional semantics.
Composition in Distributional Models of Semantics
This article proposes a framework for representing the meaning of word combinations in vector space in terms of additive and multiplicative functions, and introduces a wide range of composition models that are evaluated empirically on a phrase similarity task.
Contextualizing Semantic Representations Using Syntactically Enriched Vector Models
A syntactically enriched vector model that supports the computation of contextualized semantic representations in a quasi-compositional fashion is presented and substantially outperforms previous work on a paraphrase ranking task and achieves promising results on a word sense similarity task.
A Structured Vector Space Model for Word Meaning in Context
A novel structured vector space model is presented that makes it possible to integrate syntax into the computation of word meaning in context and performs at and above the state of the art for modeling the contextual adequacy of paraphrases.
Multi-Step Regression Learning for Compositional Distributional Semantics
The method is found to outperform existing leading methods, and it is argued in the analysis that the nature of this learning method also renders it suitable for solving more subtle problems that compositional distributional models might face.
A Regression Model of Adjective-Noun Compositionality in Distributional Semantics
In this paper we explore the computational modelling of compositionality in distributional models of semantics. In particular, we model the semantic composition of pairs of adjacent English…
A comparison of models of word meaning in context
It is shown that the models are essentially equivalent if syntactic information is ignored, and that the substantial performance differences previously reported disappear to a large extent when these simplified variants are evaluated under identical conditions.
Semantic Compositionality through Recursive Matrix-Vector Spaces
A recursive neural network model that learns compositional vector representations for phrases and sentences of arbitrary syntactic type and length and can learn the meaning of operators in propositional logic and natural language is introduced.
Word Meaning in Context: A Simple and Effective Vector Model
A model that represents word meaning in context by vectors which are modified according to the words in the target’s syntactic context, which outperforms all previous models on a word sense disambiguation task.