A Correlated Topic Model Using Word Embeddings

Guangxu Xun, Yaliang Li, Wayne Xin Zhao, Jing Gao, Aidong Zhang
Conventional correlated topic models capture correlation structure among latent topics by replacing the Dirichlet prior with the logistic normal distribution. Word embeddings have been shown to capture semantic regularities in language, so the semantic relatedness and correlations between words can be computed directly in the embedding space, for example via cosine similarity. In this paper, we propose a novel correlated topic model using word embeddings. The…
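The two ingredients named in the abstract can be illustrated concretely. Below is a minimal sketch (with made-up toy vectors, not trained embeddings) of cosine similarity between word vectors, and of a logistic-normal draw of topic proportions, where topic correlations enter through the covariance matrix; the specific values of `mu` and `Sigma` are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def cosine_similarity(u, v):
    # Cosine of the angle between two embedding vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy 3-dimensional "embeddings" (illustrative values only).
king = np.array([0.8, 0.3, 0.1])
queen = np.array([0.7, 0.4, 0.2])
apple = np.array([0.1, 0.9, 0.6])

# Related words point in similar directions, so their cosine is larger.
sim_related = cosine_similarity(king, queen)
sim_unrelated = cosine_similarity(king, apple)

# Logistic-normal prior over topic proportions: unlike the Dirichlet,
# off-diagonal entries of Sigma let topics co-occur (be correlated).
rng = np.random.default_rng(0)
mu = np.zeros(3)
Sigma = np.array([[1.0, 0.8, 0.0],
                  [0.8, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])  # topics 0 and 1 are correlated
eta = rng.multivariate_normal(mu, Sigma)
theta = np.exp(eta) / np.exp(eta).sum()  # softmax maps R^K onto the simplex
```

Here `theta` is a valid probability vector over topics, and draws of `eta` respect the correlation structure encoded in `Sigma`, which is exactly the mechanism the Dirichlet prior cannot express.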


