|
|
Deep Multi-network Embedded Clustering

CHEN Rui1,2,3, TANG Yongqiang2, ZHANG Caixia1,3, ZHANG Wensheng2, HAO Zhifeng1

1. School of Mechatronic Engineering and Automation, Foshan University, Foshan 528225
2. Research Center of Precision Sensing and Control, Institute of Automation, Chinese Academy of Sciences, Beijing 100190
3. Guangdong Province Smart City Infrastructure Health Monitoring and Evaluation Engineering Technology Research Center, Guangdong Engineering Technology Research Center, Foshan 528000
|
|
Abstract Existing deep unsupervised clustering methods rely on a single network structure and therefore cannot fully exploit the complementary information among the features extracted by different network architectures, which restricts clustering performance. To address this problem, a deep multi-network embedded clustering (DMNEC) algorithm is proposed. Firstly, multiple network branches are pretrained in an end-to-end manner to obtain the initialization parameters of each network. On this basis, a multi-network soft assignment is defined, and a clustering-oriented Kullback-Leibler (KL) divergence loss is established with the help of a multi-network auxiliary target distribution. The decoding networks from the pretraining stage are fine-tuned via a reconstruction loss to preserve the local structure and avoid distortion of the feature space. The weighted sum of the reconstruction loss and the clustering loss is optimized by stochastic gradient descent (SGD) with back propagation (BP) to jointly learn multi-network representations and cluster assignments. Experiments on four public image datasets demonstrate the superiority of the proposed algorithm.
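The clustering objective summarized in the abstract follows the DEC family of methods: a Student's t-distribution soft assignment between embedded points and cluster centers, a sharpened auxiliary target distribution, and a KL divergence clustering loss combined with a reconstruction loss. The sketch below illustrates these pieces for a single network branch; the function names, the toy mean-squared reconstruction term, and the weight `gamma` are illustrative assumptions, not the paper's exact formulation (DMNEC additionally aggregates the assignments across multiple network branches, which is omitted here).

```python
import numpy as np

def soft_assignment(z, centers, alpha=1.0):
    """Student's t-distribution similarity q_ij between embedded points
    z (n, d) and cluster centers (k, d), normalized per sample."""
    dist_sq = ((z[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    q = (1.0 + dist_sq / alpha) ** (-(alpha + 1.0) / 2.0)
    return q / q.sum(axis=1, keepdims=True)

def target_distribution(q):
    """Auxiliary target p_ij: squares and renormalizes q to emphasize
    high-confidence assignments, as in DEC-style deep clustering."""
    weight = q ** 2 / q.sum(axis=0)
    return weight / weight.sum(axis=1, keepdims=True)

def kl_clustering_loss(p, q, eps=1e-12):
    """KL(P || Q), the clustering-oriented loss minimized during fine-tuning."""
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def total_loss(p, q, x, x_recon, gamma=0.1):
    """Weighted sum of reconstruction loss and clustering loss
    (gamma is an illustrative trade-off weight)."""
    recon = float(np.mean((x - x_recon) ** 2))
    return recon + gamma * kl_clustering_loss(p, q)
```

In practice both losses are backpropagated jointly, so the encoder, decoder, and cluster centers are updated together; keeping the reconstruction term is what prevents the feature space from collapsing onto the cluster centers.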
|
Received: 02 September 2020
|
|
Fund: Youth Program of National Natural Science Foundation of China (No.61803087), Feature Innovation Project of Guangdong Province Department of Education (No.2019KTSCX192), Guangdong-Hong Kong-Macao Applied Mathematics Center Project of Guangdong Basic and Applied Basic Research Fund (No.2020B1515310003), Foshan Core Technology Research Project (No.1920001001367)
Corresponding Author:
ZHANG Caixia, Ph.D., professor. Her research interests include artificial intelligence, machine learning and data mining.
|
About authors:
CHEN Rui, master student. His research interests include machine learning, data mining and computer vision.
TANG Yongqiang, Ph.D., assistant professor. His research interests include machine learning, data mining and computer vision.
ZHANG Wensheng, Ph.D., professor. His research interests include artificial intelligence, machine learning and data mining.
HAO Zhifeng, Ph.D., professor. His research interests include artificial intelligence, machine learning and data mining.
|
|
|
|
|
|