|
|
Unsupervised Feature Selection Based on Locality Preserving Projection and Sparse Representation
JIAN Cai-Ren, CHEN Xiao-Yun |
College of Mathematics and Computer Science, Fuzhou University, Fuzhou 350116
|
|
Abstract Traditional filter-based feature selection methods score each feature independently from a statistical or geometric perspective and select features according to these scores, thus ignoring the correlations among different features. To address this problem, an unsupervised feature selection method based on locality preserving projection and sparse representation is proposed. In the proposed method, the feature weights are constrained to be nonnegative and sparse, and features are selected according to the learned weights. Experimental results on four gene expression datasets and two image datasets demonstrate the effectiveness of the proposed method.
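The abstract only outlines the recipe (a locality preserving neighborhood graph combined with nonnegative, sparse feature weights), not the exact objective or optimization. The following Python sketch is one possible reading of that recipe, not the authors' formulation: the function name select_features, the k-NN graph construction, and the use of scikit-learn's Lasso with a nonnegativity constraint are illustrative assumptions.

```python
import numpy as np
from scipy.sparse.csgraph import laplacian
from sklearn.neighbors import kneighbors_graph
from sklearn.linear_model import Lasso
from sklearn.preprocessing import StandardScaler


def select_features(X, n_selected=50, n_neighbors=5, n_components=5, alpha=0.01):
    """Rank features by nonnegative sparse weights learned from a
    locality-preserving spectral embedding (illustrative sketch)."""
    X = StandardScaler().fit_transform(X)          # n_samples x n_features

    # 1. Locality-preserving structure: symmetric k-NN affinity graph
    #    and its normalized graph Laplacian.
    W = kneighbors_graph(X, n_neighbors, mode='connectivity', include_self=False)
    W = 0.5 * (W + W.T)
    L = laplacian(W, normed=True)

    # 2. Spectral embedding: eigenvectors of L with the smallest eigenvalues
    #    (the first, trivial one is discarded).
    _, eigvecs = np.linalg.eigh(L.toarray())
    Y = eigvecs[:, 1:n_components + 1]             # n_samples x n_components

    # 3. Sparse nonnegative feature weights: regress each embedding dimension
    #    on the features with an L1 penalty and a nonnegativity constraint.
    weights = np.zeros(X.shape[1])
    for j in range(Y.shape[1]):
        lasso = Lasso(alpha=alpha, positive=True, max_iter=10000)
        lasso.fit(X, Y[:, j])
        weights += lasso.coef_                     # coefficients are >= 0

    # 4. Keep the features with the largest accumulated weights.
    return np.argsort(weights)[::-1][:n_selected]
```

On a gene expression matrix of shape (n_samples, n_genes), the returned indices identify the retained genes; clustering on the reduced matrix is the usual way such an unsupervised selection is evaluated.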
|
Received: 19 December 2013
|
|
|
|
|
|
|
|