Application to Topic Modeling of Time-Stamped Documents


Abstract

We introduce a new application of Dynamic Factor Graphs (DFGs) to topic modeling, text classification and information retrieval. DFGs are tailored here to sequences of time-stamped documents. Based on the auto-encoder architecture, our nonlinear multi-layer model is trained stage-wise to produce increasingly compact representations of bags-of-words at the document or paragraph level, thus performing a semantic analysis. It also incorporates simple temporal dynamics on the latent representations, to take advantage of the inherent (hierarchical) structure of sequences of documents, and it can simultaneously perform a supervised classification or regression on document labels, which makes our approach unique. The model is learned by maximizing the joint likelihood of the encoding, decoding, dynamical and supervised modules, using an approximate, gradient-based maximum-a-posteriori inference. We demonstrate that by minimizing a weighted cross-entropy loss between histograms of word occurrences and their reconstructions, we directly minimize the topic-model perplexity, and we show that our topic model achieves lower perplexity than Latent Dirichlet Allocation on the NIPS and State of the Union datasets. We illustrate how the dynamical constraints help learning while enabling visualization of topic trajectories.
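The link claimed in the abstract between a count-weighted cross-entropy loss and perplexity can be sketched numerically: per-word perplexity is the exponential of the average negative log-likelihood of the observed word tokens, so minimizing the weighted cross-entropy between a document's word histogram and its reconstructed word distribution minimizes perplexity directly. The function names and toy numbers below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def cross_entropy_loss(word_counts, reconstruction):
    # Weighted cross-entropy: each word token contributes -log p(word),
    # so the counts act as weights on the log-probabilities.
    probs = reconstruction / reconstruction.sum()
    return -np.sum(word_counts * np.log(probs))

def perplexity(word_counts, reconstruction):
    # Per-word perplexity = exp(average negative log-likelihood per token).
    n_tokens = word_counts.sum()
    return np.exp(cross_entropy_loss(word_counts, reconstruction) / n_tokens)

# Toy document: counts over a 4-word vocabulary and a hypothetical
# reconstructed word distribution produced by a decoder.
counts = np.array([3.0, 1.0, 0.0, 2.0])
recon = np.array([0.4, 0.2, 0.1, 0.3])
print(perplexity(counts, recon))  # ~3.09 for this toy example
```

Because the exponential is monotonic, any gradient step that lowers the weighted cross-entropy also lowers the perplexity, which is the sense in which the abstract says the loss "directly" minimizes it.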

References used

Deerwester, S., Dumais, S., Furnas, G., Landauer, T. and Harshman, R. (1990). Indexing by latent semantic analysis. Journal of the American Society for Information Science 41, 391–407.
Kolenda, T. and Hansen, L. K. (2000). Independent components in text. In Advances in Independent Component Analysis.
Gehler, P., Holub, A. and Welling, M. (2006). The rate adapting Poisson model for information retrieval and object recognition. In ICML.
Salakhutdinov, R. and Hinton, G. (2009). Replicated softmax. In ICML.
Blei, D., Ng, A. and Jordan, M. (2003). Latent Dirichlet allocation. Journal of Machine Learning Research 3, 993–1022.