Corpus ID: 211990327

HyperEmbed: Tradeoffs Between Resources and Performance in NLP Tasks with Hyperdimensional Computing enabled Embedding of n-gram Statistics

@article{Alonso2020HyperEmbedTB,
  title={HyperEmbed: Tradeoffs Between Resources and Performance in NLP Tasks with Hyperdimensional Computing enabled Embedding of n-gram Statistics},
  author={Pedro Alonso and Kumar Shridhar and Denis Kleyko and Evgeny Osipov and Marcus Liwicki},
  journal={ArXiv},
  year={2020},
  volume={abs/2003.01821}
}
Recent advances in Deep Learning have led to significant performance gains on several NLP tasks; however, the models have become increasingly computationally demanding. This paper therefore tackles the domain of computationally efficient algorithms for NLP tasks. In particular, it investigates distributed representations of n-gram statistics of texts. The representations are formed using hyperdimensional-computing-enabled embedding. These representations then serve as features, which are used…
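The core idea can be sketched as follows: each character is assigned a fixed random bipolar hypervector; the characters of an n-gram are bound together via position-encoding permutations (cyclic shifts) and elementwise multiplication, and all n-gram hypervectors are bundled by superposition. This is a minimal illustrative sketch of the general technique, not the authors' exact implementation; the function names and parameters (`item_hv`, `embed_ngram_stats`, `dim=1000`, `n=3`) are assumptions chosen for the example.

```python
import numpy as np

def item_hv(char, dim, seed=0):
    # Deterministic random bipolar (+1/-1) hypervector for a character,
    # so the same character always maps to the same hypervector.
    rng = np.random.default_rng(seed + ord(char))
    return rng.choice([-1, 1], size=dim)

def embed_ngram_stats(text, n=3, dim=1000):
    """Embed the character n-gram statistics of `text` into a single
    dim-dimensional hypervector (a sketch of HD-computing embedding)."""
    out = np.zeros(dim)
    for i in range(len(text) - n + 1):
        # Bind the n characters of the n-gram: encode each position by a
        # cyclic shift (permutation), then combine with an elementwise
        # product.
        hv = np.ones(dim)
        for pos, ch in enumerate(text[i:i + n]):
            hv *= np.roll(item_hv(ch, dim), pos)
        # Bundle all n-gram hypervectors by superposition (addition).
        out += hv
    return out

features = embed_ngram_stats("hello world")
```

The resulting fixed-size vector can be fed to any standard classifier, replacing an explicit (and much larger) n-gram count vector.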