Pachinko allocation




In machine learning and natural language processing, the pachinko allocation model (PAM) is a topic model. Topic models are a suite of algorithms for uncovering the hidden thematic structure of a collection of documents.[1] PAM improves upon earlier topic models such as latent Dirichlet allocation (LDA) by modeling correlations between topics in addition to the word correlations which constitute topics, giving it more flexibility and greater expressive power than LDA.[2] While first described and implemented in the context of natural language processing, the algorithm may have applications in other fields such as bioinformatics. The model is named for pachinko machines—a game popular in Japan, in which metal balls bounce down around a complex collection of pins until they land in various bins at the bottom.[3]

History

Pachinko allocation was first described by Wei Li and Andrew McCallum in 2006.[3] The idea was extended with hierarchical Pachinko allocation by Li, McCallum, and David Mimno in 2007.[4] In 2007, McCallum and his colleagues proposed a nonparametric Bayesian prior for PAM based on a variant of the hierarchical Dirichlet process (HDP).[2] The algorithm has been implemented in the MALLET software package published by McCallum's group at the University of Massachusetts Amherst.

Model

PAM connects the words in a vocabulary V and the topics in a set T with an arbitrary directed acyclic graph (DAG), in which topic nodes occupy the interior levels and the leaves are words.

The probability of generating a whole corpus D, given the Dirichlet parameters α, is the product of the probabilities for every document d:[3]

[math]\displaystyle{ P(\mathbf{D}|\alpha) = \prod_d P(d|\alpha) }[/math]
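The DAG structure above can be illustrated with the four-level PAM described by Li and McCallum (root, super-topics, sub-topics, words). The sketch below samples one document from that generative process; all sizes and variable names (`n_super`, `n_sub`, `vocab_size`, the symmetric Dirichlet priors) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for a small four-level PAM:
# root -> super-topics -> sub-topics -> words.
n_super, n_sub, vocab_size = 3, 5, 10
alpha_root = np.ones(n_super)            # Dirichlet prior at the root node
alpha_super = np.ones((n_super, n_sub))  # one Dirichlet prior per super-topic
# Word distributions for each sub-topic, here drawn from a symmetric Dirichlet.
phi = rng.dirichlet(np.ones(vocab_size), size=n_sub)

def generate_document(n_words):
    """Sample one document from the four-level PAM generative process."""
    # Per-document multinomials: one over super-topics at the root,
    # and one over sub-topics for each super-topic.
    theta_root = rng.dirichlet(alpha_root)
    theta_super = np.array([rng.dirichlet(a) for a in alpha_super])
    words = []
    for _ in range(n_words):
        s = rng.choice(n_super, p=theta_root)    # path step 1: root -> super-topic
        t = rng.choice(n_sub, p=theta_super[s])  # path step 2: super-topic -> sub-topic
        w = rng.choice(vocab_size, p=phi[t])     # leaf: sub-topic emits a word
        words.append(w)
    return words

doc = generate_document(20)
print(doc)
```

Because each word's topic path is sampled through the DAG, documents that favor a super-topic tend to mix its sub-topics together, which is how PAM captures correlations between topics that LDA's flat structure cannot.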

References

  1. Blei, David. "Topic modeling". http://www.cs.princeton.edu/~blei/topicmodeling.html. Retrieved 4 October 2012. 
  2. Li, Wei; Blei, David; McCallum, Andrew (2007). Nonparametric Bayes Pachinko Allocation.
  3. Li, Wei; McCallum, Andrew (2006). "Pachinko allocation: DAG-structured mixture models of topic correlations". Proceedings of the 23rd international conference on Machine learning - ICML '06. pp. 577–584. doi:10.1145/1143844.1143917. ISBN 1595933832. http://www.cs.umass.edu/~mccallum/papers/pam-icml06.pdf.
  4. Mimno, David; Li, Wei; McCallum, Andrew (2007). "Mixtures of hierarchical topics with Pachinko allocation". Proceedings of the 24th international conference on Machine learning. pp. 633–640. doi:10.1145/1273496.1273576. ISBN 9781595937933. http://maroo.cs.umass.edu/pdf/IR-587.pdf. 
  5. Hofmann, Thomas (1999). "Probabilistic Latent Semantic Indexing". Proceedings of the Twenty-Second Annual International SIGIR Conference on Research and Development in Information Retrieval. Archived from the original on 2010-12-14. https://web.archive.org/web/20101214074049/http://www.cs.brown.edu/~th/papers/Hofmann-SIGIR99.pdf. 
  6. Blei, David M.; Ng, Andrew Y.; Jordan, Michael I; Lafferty, John (January 2003). "Latent Dirichlet allocation". Journal of Machine Learning Research 3: pp. 993–1022. http://jmlr.csail.mit.edu/papers/v3/blei03a.html. Retrieved 19 July 2010. 




Licensed under CC BY-SA 3.0 | Source: https://handwiki.org/wiki/Pachinko_allocation