Corpus ID: 4992154

Learning Word Representations with Hierarchical Sparse Coding

  • Dani Yogatama, Manaal Faruqui, Chris Dyer, Noah A. Smith
  • Published 2015
  • Computer Science, Mathematics
  • ArXiv
  • We propose a new method for learning word representations using hierarchical regularization in sparse coding, inspired by the linguistic study of word meanings. We show an efficient learning algorithm based on stochastic proximal methods that is significantly faster than previous approaches, making it possible to perform hierarchical sparse coding on a corpus of billions of word tokens. Experiments on various benchmark tasks (word similarity ranking, analogies, sentence completion, and …)
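The abstract describes the approach only at a high level. As a rough, illustrative sketch (not the paper's actual algorithm), the generic building block, sparse coding solved by proximal gradient (ISTA-style) updates, looks like the following; the paper replaces the plain l1 penalty used here with a tree-structured group regularizer and uses stochastic proximal updates to scale to billions of tokens. All names, sizes, and parameters below are illustrative assumptions.

```python
import numpy as np

def soft_threshold(A, t):
    # Proximal operator of t * ||.||_1: shrink every entry toward zero by t.
    return np.sign(A) * np.maximum(np.abs(A) - t, 0.0)

def sparse_code(X, D, lam, steps=500):
    """Fit codes A so that X ~ D @ A by proximal gradient descent on
    0.5 * ||X - D A||_F^2 + lam * ||A||_1 (the plain-l1 stand-in for the
    paper's hierarchical regularizer)."""
    L = np.linalg.eigvalsh(D.T @ D).max()  # Lipschitz constant of the smooth part
    step = 1.0 / L
    A = np.zeros((D.shape[1], X.shape[1]))
    for _ in range(steps):
        grad = D.T @ (D @ A - X)           # gradient of the reconstruction loss
        A = soft_threshold(A - step * grad, step * lam)  # gradient step, then prox
    return A

# Toy check: data generated from only the first 3 of 10 dictionary atoms,
# so the recovered codes should be sparse outside those rows.
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 10))
A_true = np.zeros((10, 5))
A_true[:3] = rng.standard_normal((3, 5))
X = D @ A_true
A_hat = sparse_code(X, D, lam=0.05)
```

For the hierarchical case, the proximal operator is applied to nested groups of code dimensions (leaves to root) instead of to individual entries; the overall gradient-then-prox structure is unchanged.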
    46 Citations
    • Sparse Coding of Neural Word Embeddings for Multilingual Sequence Labeling (Gábor Berend, Transactions of the Association for Computational Linguistics, 2017)
    • Massively Multilingual Sparse Word Representations
    • Clustering of words using dictionary-learnt word representations
    • Sparse Bilingual Word Representations for Cross-lingual Lexical Entailment
    • Sparse Overcomplete Word Vector Representations
    • Sparsifying Word Representations for Deep Unordered Sentence Modeling
    • Sparse Lifting of Dense Vectors: Unifying Word and Sentence Representations
    • Investigating Language Universal and Specific Properties in Word Embeddings

