Near-lossless Binarization of Word Embeddings

@inproceedings{Tissier2019NearlosslessBO,
  title={Near-lossless Binarization of Word Embeddings},
  author={Julien Tissier and Amaury Habrard and Christophe Gravier},
  booktitle={AAAI},
  year={2019}
}
  • Word embeddings are commonly used as a starting point in many NLP models to achieve state-of-the-art performance. However, with a large vocabulary and many dimensions, these floating-point representations are expensive in both memory and computation, which makes them unsuitable for low-resource devices. The method proposed in this paper transforms real-valued embeddings into binary embeddings while preserving semantic information, requiring only 128 or 256 bits for each vector…
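The memory argument above can be made concrete with a small sketch. The paper learns the binary transform with an autoencoder; the code below is not that method, just an illustration of the payoff: a naive random-projection sign binarization (the projection matrix and bit width are assumptions for the example) that compresses a 300-dimensional float32 vector (1200 bytes) down to 256 bits (32 bytes), after which similarity can be estimated with cheap Hamming-distance operations.

```python
import numpy as np

def binarize(vec, n_bits=256):
    """Naive sign binarization via a fixed random projection.

    Illustration only -- the paper's method instead *learns* this
    transform with an autoencoder to preserve semantics.
    """
    rng = np.random.default_rng(0)                        # fixed seed: same projection every call
    proj = rng.standard_normal((vec.shape[-1], n_bits))   # assumed random projection matrix
    bits = (vec @ proj) > 0                               # keep only the sign of each projection
    return np.packbits(bits.astype(np.uint8))             # pack to n_bits / 8 bytes

def hamming_sim(a, b):
    """Similarity from the Hamming distance between packed bit vectors."""
    dist = np.unpackbits(a ^ b).sum()                     # XOR + popcount
    return 1.0 - dist / (8 * a.size)

v = np.random.default_rng(1).standard_normal(300).astype(np.float32)
b = binarize(v)
print(v.nbytes, "bytes ->", b.nbytes, "bytes")  # 1200 bytes -> 32 bytes
print(hamming_sim(b, b))                        # 1.0 (identical vectors)
```

On real hardware the XOR/popcount in `hamming_sim` maps to a handful of CPU instructions per vector, which is the source of the speedup the abstract alludes to for low-resource devices.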

