Adaptive Rulkov Neuron Clustering Algorithm
LIAO Yunrong1, REN Haipeng1
1. College of Armament Science and Technology, Xi'an Technological University, Xi'an 710021
|
|
Abstract Aiming at the clustering of sample datasets with small inter-class distances and poor separability, an adaptive Rulkov neuron clustering algorithm is proposed. Firstly, a similarity matrix based on adaptive distance and shared nearest neighbors is constructed. Secondly, according to the similarity matrix, the optimal cut of the undirected graph formed by the samples is replaced by the Laplacian spectral decomposition of the matrix, and the eigenvectors of the Laplacian matrix corresponding to the larger eigenvalues are selected as new features of the samples. Thus, the inter-class distance is increased and the intra-class distance is reduced. Then, the samples are mapped to Rulkov neurons whose mutual coupling strengths are determined by the distances between samples, and the separability of different clusters is improved by self-learning of the coupling strengths. Finally, the strongly coupled subsets of the neural network are taken as the clustering result. Comparative experiments conducted on synthetic and real datasets show that the proposed algorithm achieves better clustering performance.
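The pipeline in the abstract can be illustrated with a minimal sketch. The shared-nearest-neighbor weighting, the embedding dimension, and all parameter values below are illustrative assumptions, not the paper's actual settings; only the two-dimensional Rulkov map itself follows its standard published form, x_{n+1} = α/(1 + x_n²) + y_n, y_{n+1} = y_n − μ(x_n − σ).

```python
import numpy as np

def snn_similarity(X, k=5):
    """Similarity matrix combining distance with shared-nearest-neighbor
    counts (illustrative stand-in for the paper's adaptive construction)."""
    n = len(X)
    d = np.linalg.norm(X[:, None] - X[None, :], axis=2)
    knn = np.argsort(d, axis=1)[:, 1:k + 1]          # k nearest neighbors of each sample
    S = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            shared = len(set(knn[i]) & set(knn[j]))  # shared-neighbor count
            S[i, j] = S[j, i] = (shared / k) * np.exp(-d[i, j] ** 2 / (2 * d.mean() ** 2))
    return S

def spectral_embedding(S, dim=2):
    """New sample features: eigenvectors of the symmetrically normalized
    similarity matrix associated with the largest eigenvalues."""
    deg = S.sum(axis=1) + 1e-12
    D_inv_sqrt = np.diag(deg ** -0.5)
    M = D_inv_sqrt @ S @ D_inv_sqrt                  # normalized affinity matrix
    vals, vecs = np.linalg.eigh(M)                   # eigenvalues in ascending order
    return vecs[:, -dim:]                            # keep leading eigenvectors

def rulkov_step(x, y, alpha=4.1, mu=0.001, sigma=-1.2):
    """One iteration of the two-dimensional chaotic Rulkov map."""
    return alpha / (1.0 + x ** 2) + y, y - mu * (x - sigma)

# Usage sketch on two synthetic blobs: embed the samples, then drive one
# Rulkov neuron per sample (coupling and self-learning omitted here).
X = np.vstack([np.random.randn(20, 2), np.random.randn(20, 2) + 5.0])
F = spectral_embedding(snn_similarity(X), dim=2)     # one embedded row per sample
x, y = rulkov_step(np.zeros(len(F)), np.full(len(F), -3.0))
```

In a full implementation, the neurons would be coupled with strengths set by pairwise distances in the embedded space, the coupling would adapt by self-learning, and strongly coupled subsets would be read out as clusters.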
|
Received: 12 April 2021
|
|
Corresponding Author:
REN Haipeng, Ph.D., professor. His research interests include complex system control and intelligent information processing.
|
About author: LIAO Yunrong, master's student. Her research interests include computer vision and hardware implementation of digital image processing algorithms.
|
|
|