Dynamic Word Embeddings

@inproceedings{Bamler2017DynamicWE,
  title={Dynamic Word Embeddings},
  author={Robert Bamler and Stephan Mandt},
  booktitle={ICML},
  year={2017}
}
We present a probabilistic language model for time-stamped text data which tracks the semantic evolution of individual words over time. The model represents words and contexts by latent trajectories in an embedding space. At each moment in time, the embedding vectors are inferred from a probabilistic version of word2vec (Mikolov et al., 2013b). These embedding vectors are connected in time through a latent diffusion process. We describe two scalable variational inference algorithms—skipgram…
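The core idea in the abstract—embedding vectors at each time step tied together by a latent diffusion process—can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's actual model: it uses a plain Gaussian random walk as the diffusion prior (the paper uses a more refined latent process), invented toy dimensions (`V`, `D`, `T`), and an arbitrary noise scale `sigma`, with a word2vec-style sigmoid of dot products as the co-occurrence probability.

```python
import numpy as np

rng = np.random.default_rng(0)

V, D, T = 5, 3, 4   # toy vocabulary size, embedding dimension, time steps
sigma = 0.1         # transition noise scale of the diffusion (assumed value)

# Latent trajectories: each word's embedding drifts over time via a
# Gaussian random walk, so consecutive time steps stay close.
U = np.zeros((T, V, D))
U[0] = rng.normal(size=(V, D))
for t in range(1, T):
    U[t] = U[t - 1] + sigma * rng.normal(size=(V, D))

def skipgram_prob(u, v):
    """Word2vec-style Bernoulli probability that word u co-occurs with context v."""
    return 1.0 / (1.0 + np.exp(-u @ v))

# Track how the co-occurrence probability of words 0 and 1 evolves over time.
p = [skipgram_prob(U[t, 0], U[t, 1]) for t in range(T)]
```

Because the diffusion couples adjacent time steps, the probabilities in `p` change smoothly rather than being re-estimated independently at each time slice—this is what lets the model track gradual semantic drift.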