Latent Dirichlet allocation (LDA) is a powerful method for data analysis in machine learning and applied statistics. Parallel LDA algorithms have attracted intensive research interest because big data, such as the billions of tweets, images, and videos on the web, have become increasingly common in recent years. This paper introduces PLL, a parallel LDA learning toolbox for big topic modeling. The toolbox includes three main inference algorithms for learning LDA: variational Bayes (VB), collapsed Gibbs sampling (GS), and belief propagation (BP). The toolbox is an ongoing project, and more parallel LDA algorithms for various topic models will be added in the near future.
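To illustrate one of the three inference algorithms named above, the sketch below shows a minimal single-threaded collapsed Gibbs sampler for LDA. It is a generic textbook implementation, not the PLL toolbox's own code; the function name `lda_gibbs` and all parameter defaults are assumptions for illustration only.

```python
import numpy as np

def lda_gibbs(docs, n_topics, n_vocab, n_iter=200, alpha=0.1, beta=0.01, seed=0):
    """Toy collapsed Gibbs sampler for LDA.

    docs: list of documents, each a list of integer word ids in [0, n_vocab).
    Returns posterior-mean estimates (theta, phi) of the document-topic
    and topic-word distributions.
    """
    rng = np.random.default_rng(seed)
    n_dk = np.zeros((len(docs), n_topics))  # doc-topic counts
    n_kw = np.zeros((n_topics, n_vocab))    # topic-word counts
    n_k = np.zeros(n_topics)                # total tokens per topic

    # Random initial topic assignment for every token.
    z = []
    for d, doc in enumerate(docs):
        zd = rng.integers(n_topics, size=len(doc))
        z.append(zd)
        for w, k in zip(doc, zd):
            n_dk[d, k] += 1
            n_kw[k, w] += 1
            n_k[k] += 1

    for _ in range(n_iter):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                # Remove the current token's assignment from the counts.
                k = z[d][i]
                n_dk[d, k] -= 1
                n_kw[k, w] -= 1
                n_k[k] -= 1
                # Sample a new topic from the collapsed conditional.
                p = (n_dk[d] + alpha) * (n_kw[:, w] + beta) / (n_k + n_vocab * beta)
                k = rng.choice(n_topics, p=p / p.sum())
                z[d][i] = k
                n_dk[d, k] += 1
                n_kw[k, w] += 1
                n_k[k] += 1

    # Smoothed posterior-mean estimates.
    theta = (n_dk + alpha) / (n_dk.sum(1, keepdims=True) + n_topics * alpha)
    phi = (n_kw + beta) / (n_kw.sum(1, keepdims=True) + n_vocab * beta)
    return theta, phi
```

Parallel variants such as those in the toolbox typically partition the documents (and thus the inner loops) across workers and periodically synchronize the global topic-word counts `n_kw`.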
- Changes to previous version:
Fixed some compilation errors.