[1] BLEI D M, NG A Y, JORDAN M I. Latent Dirichlet Allocation. Journal of Machine Learning Research, 2003, 3: 993-1022.
[2] BLEI D M, LAFFERTY J D. Correlated Topic Models[C/OL]. [2019-04-21]. http://people.ee.duke.edu/~lcarin/Blei2005CTM.pdf.
[3] BLEI D M, LAFFERTY J D. Dynamic Topic Models // Proc of the 23rd International Conference on Machine Learning. New York, USA: ACM, 2006: 113-120. DOI: 10.1145/1143844.1143859.
[4] CHEN J F, ZHU J, LU J, et al. Scalable Training of Hierarchical Topic Models. Proceedings of the VLDB Endowment, 2018, 11(7): 826-839.
[5] BLEI D M, GRIFFITHS T L, JORDAN M I, et al. Hierarchical Topic Models and the Nested Chinese Restaurant Process[C/OL]. [2019-04-21]. https://people.eecs.berkeley.edu/~jordan/papers/lda-crp.pdf.
[6] MIAO Y S, YU L, BLUNSOM P. Neural Variational Inference for Text Processing // Proc of the 33rd International Conference on Machine Learning. Berlin, Germany: Springer, 2016: 1727-1736.
[7] WEI X, CROFT W B. LDA-Based Document Models for Ad-Hoc Retrieval // Proc of the 29th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval. New York, USA: ACM, 2006: 178-185.
[8] WANG C, BLEI D M. Collaborative Topic Modeling for Recommending Scientific Articles // Proc of the 17th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York, USA: ACM, 2011: 448-456.
[9] LIU S X, WANG X T, LIU J F, et al. TopicPanorama: A Full Picture of Relevant Topics // Proc of the IEEE Conference on Visual Analytics Science and Technology. Washington, USA: IEEE, 2014: 183-192.
[10] WANG Y, ZHAO X M, SUN Z L, et al. Towards Topic Modeling for Big Data[C/OL]. [2019-04-21]. https://arxiv.org/pdf/1405.4402v1.pdf.
[11] DONG Y P, SU H, ZHU J, et al. Improving Interpretability of Deep Neural Networks with Semantic Information // Proc of the IEEE Conference on Computer Vision and Pattern Recognition. Washington, USA: IEEE, 2017: 4306-4314.
[12] MCAULIFFE J D, BLEI D M. Supervised Topic Models // PLATT J C, KOLLER D, SINGER Y, et al., eds. Advances in Neural Information Processing Systems 20. Cambridge, USA: The MIT Press, 2008: 121-128.
[13] LACOSTE-JULIEN S, SHA F, JORDAN M I. DiscLDA: Discriminative Learning for Dimensionality Reduction and Classification // KOLLER D, SCHUURMANS D, BENGIO Y, et al., eds. Advances in Neural Information Processing Systems 21. Cambridge, USA: The MIT Press, 2009: 897-904.
[14] ZHU J, AHMED A, XING E P. MedLDA: Maximum Margin Supervised Topic Models. Journal of Machine Learning Research, 2012, 13: 2237-2278.
[15] ZHU J, CHEN N, PERKINS H, et al. Gibbs Max-Margin Topic Models with Data Augmentation. Journal of Machine Learning Research, 2014, 15: 1073-1110.
[16] YAO L M, MIMNO D, MCCALLUM A. Efficient Methods for Topic Model Inference on Streaming Document Collections // Proc of the 15th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York, USA: ACM, 2009: 937-946.
[17] LI A Q, AHMED A, RAVI S, et al. Reducing the Sampling Complexity of Topic Models // Proc of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York, USA: ACM, 2014: 891-900.
[18] YU H F, HSIEH C J, YUN H, et al. A Scalable Asynchronous Distributed Algorithm for Topic Modeling // Proc of the 24th International Conference on World Wide Web. New York, USA: ACM, 2015: 1340-1350.
[19] YUAN J H, GAO F, HO Q R, et al. LightLDA: Big Topic Models on Modest Computer Clusters // Proc of the 24th International Conference on World Wide Web. New York, USA: ACM, 2015: 1351-1361.
[20] CHEN J F, LI K W, ZHU J, et al. WarpLDA: A Cache Efficient O(1) Algorithm for Latent Dirichlet Allocation. Proceedings of the VLDB Endowment, 2016, 9(10): 744-755.
[21] JIANG Q X, ZHU J, SUN M S, et al. Monte Carlo Methods for Maximum Margin Supervised Topic Models // PEREIRA F, BURGES C J C, BOTTOU L, et al., eds. Advances in Neural Information Processing Systems 25. Cambridge, USA: The MIT Press, 2012: 1592-1600.
[22] ZHU J, CHEN J F, HU W B, et al. Big Learning with Bayesian Methods. National Science Review, 2017, 4(4): 627-651.
[23] MEI S K, ZHU J, ZHU X J. Robust RegBayes: Selectively Incorporating First-Order Logic Domain Knowledge into Bayesian Models // Proc of the 31st International Conference on Machine Learning. Berlin, Germany: Springer, 2014: 253-261.
[24] KOYEJO O, GHOSH J. Constrained Bayesian Inference for Low Rank Multitask Learning // Proc of the 29th Conference on Uncertainty in Artificial Intelligence. Bellevue, USA: AUAI Press, 2013: 341-350.
[25] GRIFFITHS T L, STEYVERS M. Finding Scientific Topics. Proceedings of the National Academy of Sciences of the United States of America, 2004, 101(Suppl 1): 5228-5235.
[26] ZHENG X, YU Y L, XING E P. Linear Time Samplers for Supervised Topic Models Using Compositional Proposals // Proc of the 21st ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York, USA: ACM, 2015: 1523-1532.
[27] HSIEH C J, CHANG K W, LIN C J, et al. A Dual Coordinate Descent Method for Large-Scale Linear SVM // Proc of the 25th International Conference on Machine Learning. Berlin, Germany: Springer, 2008: 408-415.
[28] SHALEV-SHWARTZ S, ZHANG T. Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization. Journal of Machine Learning Research, 2013, 14: 567-599.
[29] LEWIS D D, YANG Y M, ROSE T G, et al. RCV1: A New Benchmark Collection for Text Categorization Research. Journal of Machine Learning Research, 2004, 5: 361-397.