Corpus ID: 32788171

HYBED: HYPERBOLIC NEURAL GRAPH EMBEDDING

@inproceedings{Kolkin2017HYBEDHN,
  title={HYBED: HYPERBOLIC NEURAL GRAPH EMBEDDING},
  author={Nicholas I. Kolkin},
  year={2017}
}
Abstract: Neural embeddings have been used with great success in Natural Language Processing (NLP). They provide compact representations that encapsulate word similarity and attain state-of-the-art performance in a range of linguistic tasks. The success of neural embeddings has prompted significant amounts of research into applications in domains other than language. One such domain is graph-structured data, where embeddings of vertices can be learned that encapsulate vertex similarity and improve…
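The excerpt ends before the method is described, but the "hyperbolic" in the title refers to embedding vertices in hyperbolic rather than Euclidean space. As a minimal sketch (not taken from this paper's text), the standard Poincaré-ball distance that such embeddings typically optimize can be computed as follows:

```python
import math

def poincare_distance(u, v):
    """Hyperbolic distance between two points inside the unit Poincare ball.

    d(u, v) = arcosh(1 + 2 * |u - v|^2 / ((1 - |u|^2) * (1 - |v|^2)))
    """
    sq = lambda x: sum(xi * xi for xi in x)          # squared Euclidean norm
    diff = [ui - vi for ui, vi in zip(u, v)]
    num = 2.0 * sq(diff)
    denom = (1.0 - sq(u)) * (1.0 - sq(v))            # both points must satisfy |x| < 1
    return math.acosh(1.0 + num / denom)

# Distances grow rapidly as points approach the ball's boundary,
# which is what makes hyperbolic space well suited to tree-like hierarchies.
print(poincare_distance([0.0, 0.0], [0.5, 0.0]))     # acosh(5/3) = ln 3 ≈ 1.0986
```

Note that this is the geometry underlying hyperbolic embedding methods in general, not necessarily the exact formulation used in HYBED.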


    References

    Showing 1-10 of 34 publications referenced by this paper.

    node2vec: Scalable Feature Learning for Networks
    • Aditya Grover, Jure Leskovec
    • 2016

    LINE: Large-scale Information Network Embedding
    • Jian Tang, Meng Qu, Mingzhe Wang, Ming Zhang, Jun Yan, Qiaozhu Mei
    • 2015

    Item2Vec: Neural item embedding for collaborative filtering
    • Oren Barkan, Noam Koenigstein
    • 2016 IEEE 26th International Workshop on Machine Learning for Signal Processing (MLSP)
    • 2016

    Distributed Representations of Words and Phrases and their Compositionality
    • Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg Corrado, Jeffrey Dean
    • 2013