Near-lossless Binarization of Word Embeddings

@inproceedings{Tissier2018NearlosslessBO,
  title={Near-lossless Binarization of Word Embeddings},
  author={Julien Tissier and Amaury Habrard and Christophe Gravier},
  booktitle={AAAI},
  year={2018}
}
Word embeddings are commonly used as a starting point in many NLP models to achieve state-of-the-art performance. However, with a large vocabulary and many dimensions, these floating-point representations are expensive in both memory and computation, which makes them unsuitable for low-resource devices. The method proposed in this paper transforms real-valued embeddings into binary embeddings while preserving semantic information, requiring only 128 or 256 bits for each vector.
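
To make the memory argument concrete, the sketch below is a minimal, hedged illustration (not the paper's learned transformation): it turns a 300-dimensional float32 vector into a 256-bit code via a random projection followed by sign thresholding and bit packing, and compares codes with a Hamming distance. The projection matrix, dimensions, and helper names are illustrative assumptions.

import numpy as np

RNG = np.random.default_rng(0)
EMBED_DIM, N_BITS = 300, 256
# Hypothetical fixed random projection; the paper instead learns the
# real-to-binary mapping, which this sketch does not reproduce.
PROJECTION = RNG.standard_normal((EMBED_DIM, N_BITS))

def binarize(embedding: np.ndarray) -> np.ndarray:
    """Map a real-valued embedding to N_BITS sign bits, packed into N_BITS // 8 bytes."""
    bits = (embedding @ PROJECTION) > 0
    return np.packbits(bits)

def hamming_distance(a: np.ndarray, b: np.ndarray) -> int:
    """Number of differing bits between two packed binary codes (XOR then popcount)."""
    return int(np.unpackbits(a ^ b).sum())

# A 300-d float32 vector (1200 bytes) becomes a 32-byte binary code.
word_vec = RNG.standard_normal(EMBED_DIM).astype(np.float32)
noisy_vec = word_vec + 0.1 * RNG.standard_normal(EMBED_DIM).astype(np.float32)
code_a, code_b = binarize(word_vec), binarize(noisy_vec)
print(code_a.nbytes, "bytes per word;", hamming_distance(code_a, code_b), "differing bits")

With 256-bit codes, similarity queries reduce to XOR and popcount operations on 32-byte words, which is the kind of saving that makes such representations attractive on low-resource devices.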