Nonparametric Image Clustering Based on Variational Bayesian Contrastive Network
ZHANG Shengjie1, WANG Yifei1, XIANG Wang1, XUE Dizhan2, QIAN Shengsheng2
1. Henan Institute of Advanced Technology, Zhengzhou University, Zhengzhou 450003; 2. State Key Laboratory of Multimodal Artificial Intelligence Systems, Institute of Automation, Chinese Academy of Sciences, Beijing 100190
Abstract: In nonparametric image clustering, the number of clusters is unknown and must be discovered by the model automatically. Although some existing Bayesian methods can infer the number of clusters automatically, they are infeasible on large-scale image datasets due to high computational costs or over-reliance on pre-learned features. Therefore, nonparametric image clustering based on a variational Bayesian contrastive network is proposed in this paper. Firstly, image features are extracted by ResNet. Secondly, a deep variational Dirichlet process mixture is put forward to infer the number of clusters automatically; it can be embedded directly into end-to-end deep models and optimized jointly with the feature extractor. Finally, polarized contrastive clustering learning is presented: a polarized-label denoising strategy is utilized to denoise and polarize the predicted labels, and the polarized labels together with the predicted labels of augmented data are employed for contrastive learning to jointly optimize the image feature extractor and the clustering model. Experiments on three benchmark datasets show the superior performance of the proposed method.