Multilingual Models for Compositional Distributed Semantics

@article{Hermann2014MultilingualMF,
  title={Multilingual Models for Compositional Distributed Semantics},
  author={Karl Moritz Hermann and Phil Blunsom},
  journal={ArXiv},
  year={2014},
  volume={abs/1404.4641}
}
We present a novel technique for learning semantic representations, which extends the distributional hypothesis to multilingual data and joint-space embeddings. Our models leverage parallel data and learn to strongly align the embeddings of semantically equivalent sentences, while maintaining sufficient distance between those of dissimilar sentences. The models do not rely on word alignments or any syntactic information and are successfully applied to a number of diverse languages. We extend…
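The alignment objective described in the abstract — pulling embeddings of semantically equivalent sentences together while keeping dissimilar sentences at least a margin apart — can be sketched as a hinge loss over composed sentence vectors. This is a minimal illustrative sketch, assuming additive bag-of-words composition and squared Euclidean distance; all names and vectors below are hypothetical, not the paper's implementation:

```python
import numpy as np

def compose(word_vecs, sentence):
    # Additive composition: a sentence embedding is the sum of its word vectors
    # (no word alignments or syntax required, as the abstract notes).
    return sum(word_vecs[w] for w in sentence)

def hinge_loss(a, b, n, margin=1.0):
    # Pull aligned sentence embeddings (a, b) together while pushing a
    # noise sentence embedding n at least `margin` further away.
    d_pos = np.sum((a - b) ** 2)
    d_neg = np.sum((a - n) ** 2)
    return max(0.0, margin + d_pos - d_neg)

# Toy vocabulary with random embeddings (illustrative only).
rng = np.random.default_rng(0)
vocab = {w: rng.normal(size=4) for w in ["the", "cat", "die", "katze", "hund"]}

en = compose(vocab, ["the", "cat"])        # English sentence
de = compose(vocab, ["die", "katze"])      # parallel German sentence
noise = compose(vocab, ["die", "hund"])    # sampled non-parallel sentence

loss = hinge_loss(en, de, noise)
```

In training, the loss would be minimized over many parallel pairs and sampled noise sentences, driving aligned translations toward a shared joint embedding space.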
