Tired of Topic Models? Clusters of Pretrained Word Embeddings Make for Fast and Good Topics too!

Suzanna Sia, Ayush Dalmia, Sabrina J. Mielke. Tired of Topic Models? Clusters of Pretrained Word Embeddings Make for Fast and Good Topics too! In Bonnie Webber, Trevor Cohn, Yulan He, and Yang Liu, editors, Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020, Online, November 16-20, 2020, pages 1728-1736. Association for Computational Linguistics, 2020.

@inproceedings{SiaDM20,
  title = {Tired of Topic Models? Clusters of Pretrained Word Embeddings Make for Fast and Good Topics too!},
  author = {Suzanna Sia and Ayush Dalmia and Sabrina J. Mielke},
  year = {2020},
  url = {https://www.aclweb.org/anthology/2020.emnlp-main.135/},
  researchr = {https://researchr.org/publication/SiaDM20},
  pages = {1728-1736},
  booktitle = {Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020, Online, November 16-20, 2020},
  editor = {Bonnie Webber and Trevor Cohn and Yulan He and Yang Liu},
  publisher = {Association for Computational Linguistics},
  isbn = {978-1-952148-60-6},
}