Reposted from: http://www.blogbus.com/krischow-logs/65749376.html

 
PLSA Notes
Ideally, go through all of these papers together with the accompanying lecture slides (PPTs).
---------------------------------------------
Model Basics
---------------------------------------------
ChengXiang Zhai, Atulya Velivelli, Bei Yu, A cross-collection mixture model for comparative text mining
This paper underlies many later applications: the first, simple model it proposes, combined with the use of prior information, became a template followed by many subsequent papers.

Yue Lu, ChengXiang Zhai. Opinion Integration Through Semi-supervised Topic Modeling
This paper is an application of the model above, but its formula derivations are exceptionally clear.
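The "simple mixture model plus prior" template mentioned above amounts to MAP estimation: the prior enters the EM M-step as pseudo-counts added to the expected counts. A minimal sketch of that update, with purely hypothetical toy numbers (the prior distributions, the strength `mu`, and the expected counts are all made up for illustration):

```python
import numpy as np

# Expected counts c[z, w] = sum_d n(d, w) * P(z|d, w), as produced by an
# E-step (toy values, 2 topics x 3 words)
expected = np.array([[3.0, 1.0, 0.5],
                     [0.5, 2.0, 4.0]])

# A prior word distribution for each topic (e.g. from a seed lexicon),
# plus a prior strength mu (total pseudo-count mass per topic)
prior = np.array([[0.6, 0.3, 0.1],
                  [0.1, 0.3, 0.6]])
mu = 5.0  # mu = 0 recovers plain maximum-likelihood estimation

# MAP M-step: P(w|z) proportional to expected counts + mu * prior pseudo-counts
p_w_z = expected + mu * prior
p_w_z /= p_w_z.sum(axis=1, keepdims=True)
```

Because the prior only adds pseudo-counts before normalization, the same E-step code works unchanged; tuning `mu` interpolates between the prior and the data.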
---------------------------------------------
Model Variants
---------------------------------------------
Qiaozhu Mei, Xu Ling, Matthew Wondra, Hang Su, ChengXiang Zhai, Topic Sentiment Mixture: Modeling Facets and Opinions in Weblogs
Once you fully understand this model, you can consider yourself past the bar on PLSA-style topic models.
---------------------------------------------
Leveling Up on EM
---------------------------------------------
Tao Tao, ChengXiang Zhai, Regularized Estimation of Mixture Models for Robust Pseudo-Relevance Feedback
If you are interested in EM, try working through this paper.
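To make the EM machinery behind these papers concrete, here is a minimal sketch of plain PLSA estimated by EM, on a made-up toy term-document matrix (the data, topic count, and iteration count are all arbitrary assumptions, not from any of the papers):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy term-document count matrix n[d, w]: 4 documents, 4 words (hypothetical)
n = np.array([[4, 2, 0, 0],
              [3, 1, 1, 0],
              [0, 0, 5, 2],
              [0, 1, 2, 4]], dtype=float)
D, W = n.shape
K = 2  # number of latent topics (assumed)

# Random initialization of P(z|d) and P(w|z), each row normalized
p_z_d = rng.random((D, K)); p_z_d /= p_z_d.sum(axis=1, keepdims=True)
p_w_z = rng.random((K, W)); p_w_z /= p_w_z.sum(axis=1, keepdims=True)

for _ in range(100):
    # E-step: P(z|d,w) proportional to P(z|d) * P(w|z), shape (D, K, W)
    post = p_z_d[:, :, None] * p_w_z[None, :, :]
    post /= post.sum(axis=1, keepdims=True) + 1e-12
    # M-step: re-estimate both distributions from expected counts n(d,w)*P(z|d,w)
    counts = n[:, None, :] * post
    p_w_z = counts.sum(axis=0)
    p_w_z /= p_w_z.sum(axis=1, keepdims=True)
    p_z_d = counts.sum(axis=2)
    p_z_d /= p_z_d.sum(axis=1, keepdims=True)

# Each row of p_w_z is one topic's word distribution; each row of p_z_d
# is one document's topic mixture
```

The cross-collection and sentiment models in the papers above follow this same E-step/M-step skeleton, just with extra mixture components (background, collection-specific, sentiment) and priors folded into the M-step.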
---------------------------------------------
No need to say more; the papers speak for themselves:

Yue Lu, ChengXiang Zhai, Neel Sundaresan, Rated Aspect Summarization of Short Comments
Maryam Karimzadehgan, ChengXiang Zhai, Geneva Belford, Multi-Aspect Expertise Matching for Review Assignment
Deng Cai, Qiaozhu Mei, Jiawei Han, ChengXiang Zhai, Modeling Hidden Topics on Document Manifold
Yue Lu, ChengXiang Zhai. Opinion Integration Through Semi-supervised Topic Modeling
Qiaozhu Mei, Deng Cai, Duo Zhang, ChengXiang Zhai. Topic Modeling with Network Regularization
Qiaozhu Mei, Xuehua Shen, and ChengXiang Zhai, Automatic Labeling of Multinomial Topic Models
Qiaozhu Mei, Xu Ling, Matthew Wondra, Hang Su, ChengXiang Zhai, Topic Sentiment Mixture: Modeling Facets and Opinions in Weblogs
Tao Tao, ChengXiang Zhai, Regularized Estimation of Mixture Models for Robust Pseudo-Relevance Feedback
ChengXiang Zhai, Atulya Velivelli, Bei Yu, A cross-collection mixture model for comparative text mining
----------------------------------------------

That's all for this post; next time I'll cover applications of LDA.
