Corpus ID: 18332428

Not All Neural Embeddings are Born Equal

@article{Hill2014NotAN,
  title={Not All Neural Embeddings are Born Equal},
  author={Felix Hill and Kyunghyun Cho and S{\'e}bastien Jean and Coline Devin and Yoshua Bengio},
  journal={ArXiv},
  year={2014},
  volume={abs/1410.0718}
}
Neural language models learn word representations that capture rich linguistic and conceptual information. Here we investigate the embeddings learned by neural machine translation models. We show that translation-based embeddings outperform those learned by cutting-edge monolingual models at single-language tasks requiring knowledge of conceptual similarity and/or syntactic role. The findings suggest that, while monolingual models learn information about how concepts are related, neural-translation models better capture their true ontological status.
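As a concrete illustration of the single-language evaluation the abstract refers to, the minimal sketch below (not code from the paper; the embeddings and ratings are placeholder data) scores word pairs by cosine similarity of their vectors and correlates those scores with human conceptual-similarity ratings via Spearman's rho, the usual protocol for benchmarks such as SimLex-999.

# Minimal sketch of a conceptual-similarity evaluation (assumed setup, not the
# authors' code). In practice the embeddings would come from a monolingual
# model (e.g. GloVe) or a translation-based model, and the ratings from a
# benchmark such as SimLex-999; the toy values here are for illustration only.

import numpy as np
from scipy.stats import spearmanr

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def evaluate(embeddings, rated_pairs):
    """Spearman correlation between model similarities and human ratings."""
    model_scores, human_scores = [], []
    for w1, w2, rating in rated_pairs:
        if w1 in embeddings and w2 in embeddings:
            model_scores.append(cosine(embeddings[w1], embeddings[w2]))
            human_scores.append(rating)
    rho, _ = spearmanr(model_scores, human_scores)
    return rho

# Placeholder embeddings and human ratings, purely for illustration.
toy_embeddings = {
    "teacher": np.array([0.9, 0.1, 0.3]),
    "instructor": np.array([0.8, 0.2, 0.4]),
    "car": np.array([0.1, 0.9, 0.2]),
    "automobile": np.array([0.2, 0.8, 0.1]),
}
toy_ratings = [
    ("teacher", "instructor", 9.0),
    ("car", "automobile", 9.5),
    ("teacher", "car", 1.5),
]

print("Spearman rho:", evaluate(toy_embeddings, toy_ratings))

A higher rho for one set of embeddings than another would indicate that its similarity structure better matches human judgments, which is the comparison the paper draws between translation-based and monolingual embeddings.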
Citations

Wiktionary-Based Word Embeddings
Learning Word Meta-Embeddings by Using Ensembles of Embedding Sets
Learning Meta-Embeddings by Using Ensembles of Embedding Sets
Learning Word Meta-Embeddings
Learning Word Meta-Embeddings by Autoencoding
Word and Document Embeddings based on Neural Network Approaches
Neural Text Embeddings for Information Retrieval
Think Globally, Embed Locally - Locally Linear Meta-embedding of Words

References

Showing 1-10 of 20 references
Dependency-Based Word Embeddings
A unified architecture for natural language processing: deep neural networks with multitask learning
Glove: Global Vectors for Word Representation
Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation
Multimodal Distributional Semantics
Neural Machine Translation by Jointly Learning to Align and Translate
A Neural Probabilistic Language Model