Subspace Clustering via Joint Feature Selection and Smooth Representation
ZHENG Jianwei, LU Cheng, QIN Mengjie, CHEN Wanjun
College of Computer Science and Technology, Zhejiang University of Technology, Hangzhou 310014
|
|
Abstract  The performance of self-representation based methods is degraded by redundant high-dimensional features. Therefore, a subspace clustering method via joint feature selection and smooth representation (FSSR) is proposed in this paper. Firstly, the idea of feature selection is integrated into the self-representation based coefficient matrix learning framework, and a weight factor is adopted to measure the different contributions of correlated features. Furthermore, a grouping effect constraint is imposed on the coefficient matrix to preserve the locality property. An algorithm based on the alternating direction method of multipliers (ADMM) is derived to optimize the proposed cost function. Experiments on synthetic data and standard databases demonstrate that FSSR outperforms state-of-the-art approaches in both accuracy and efficiency.
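To make the self-representation pipeline underlying such methods concrete, the following is a minimal illustrative sketch, not the FSSR method itself: it solves a plain least-squares self-representation problem (min_C ||X - XC||_F^2 + λ||C||_F^2, without the feature-selection weights or grouping effect constraint of FSSR), builds the affinity |C| + |C|^T, and applies spectral clustering. All function names and parameters here are illustrative choices.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def self_representation_clustering(X, n_clusters, lam=0.1, seed=0):
    """Illustrative self-representation subspace clustering (simplified;
    not the proposed FSSR objective). Columns of X are samples."""
    n = X.shape[1]
    G = X.T @ X
    # Closed-form ridge solution of min_C ||X - XC||_F^2 + lam ||C||_F^2
    C = np.linalg.solve(G + lam * np.eye(n), G)
    np.fill_diagonal(C, 0.0)            # no self-loops in the affinity
    W = np.abs(C) + np.abs(C).T         # symmetric affinity matrix
    L = np.diag(W.sum(axis=1)) - W      # unnormalized graph Laplacian
    _, vecs = np.linalg.eigh(L)
    U = vecs[:, :n_clusters]            # eigenvectors of smallest eigenvalues
    _, labels = kmeans2(U, n_clusters, seed=seed, minit='++')
    return labels

# Toy example: two orthogonal 1-D subspaces (lines) in R^3
rng = np.random.default_rng(0)
b1, b2 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
X = np.hstack([np.outer(b1, rng.uniform(1, 2, 20)),
               np.outer(b2, rng.uniform(1, 2, 20))])
labels = self_representation_clustering(X, 2)
```

Because the two toy subspaces are orthogonal, the coefficient matrix C is block-diagonal and the spectral step recovers the two groups exactly; FSSR additionally reweights features so that this block structure survives redundant or noisy dimensions.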
|
Received: 09 October 2017
|
|
Corresponding Author:
CHEN Wanjun, master, lecturer. Her research interests include machine learning and pattern recognition.
|
About the authors: ZHENG Jianwei, Ph.D., associate professor. His research interests include machine learning and pattern recognition. LU Cheng, master's student. His research interests include machine learning and data mining. QIN Mengjie, master's student. Her research interests include machine learning and pattern recognition.
|
|
|
|
|
|