Feature Selection Method for Neuropsychiatric Disorder Based on Adaptive Sparse Structure Learning
HAO Shijie1,2, GUO Yanrong1,2, CHEN Tao1,2, WANG Meng1,2, HONG Richang1,2
1. Key Laboratory of Knowledge Engineering with Big Data, Hefei University of Technology, Hefei 230601; 2. School of Computer Science and Information Engineering, Hefei University of Technology, Hefei 230601
|
|
Abstract In research on computer-aided diagnosis of neuropsychiatric diseases, professionals are required to perform diagnostic-level semantic annotation on samples, which is time-consuming and labor-intensive. Therefore, developing unsupervised techniques for the computer-aided diagnosis of neuropsychiatric diseases is of great importance. In this paper, an unsupervised feature selection method based on adaptive sparse structure learning is proposed and applied to the diagnosis of schizophrenia and Alzheimer's disease. The sparse representation and the manifold structure of the data are learned simultaneously in a unified framework. Within this framework, the generalized norm is adopted to model the reconstruction error of sparse learning, and the manifold structure of the whole dataset is updated iteratively. The lack of robustness of traditional feature selection methods is thereby alleviated. Experiments on two public datasets for schizophrenia and Alzheimer's disease demonstrate the effectiveness of the proposed method in the classification of neuropsychiatric diseases.
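The abstract describes jointly learning a sparse representation and an adaptive manifold (graph) structure, with the graph updated iteratively as the representation improves. The paper's exact objective is not reproduced here; the following is a minimal illustrative sketch in NumPy under assumed design choices: an l2,1-regularized feature self-representation term, a graph-Laplacian smoothness term, and iteratively reweighted least-squares updates. All function names and parameters are hypothetical.

```python
import numpy as np

def unsupervised_feature_selection(X, n_select, alpha=1.0, beta=1.0, n_iter=20):
    """Illustrative sketch of adaptive sparse structure learning for
    unsupervised feature selection (not the paper's exact formulation).

    X : (n_samples, n_features) data matrix
    Returns indices of the n_select highest-scoring features.
    """
    n, d = X.shape
    W = np.eye(d)  # feature self-representation weights, one row per feature
    for _ in range(n_iter):
        # Adaptive manifold structure: rebuild the similarity graph from
        # the current reconstruction XW, so the graph and the sparse
        # representation are refined together.
        Z = X @ W
        dist = np.square(Z[:, None, :] - Z[None, :, :]).sum(-1)
        S = np.exp(-dist / (dist.mean() + 1e-12))     # Gaussian affinities
        L = np.diag(S.sum(axis=1)) - S                # graph Laplacian
        # Iteratively reweighted least squares for the l2,1 row-sparsity
        # penalty: rows with small norm get heavily penalized next round.
        row_norms = np.linalg.norm(W, axis=1) + 1e-12
        D = np.diag(0.5 / row_norms)
        # Closed-form update of the regularized least-squares problem:
        # (X^T X + alpha*D + beta*X^T L X) W = X^T X
        A = X.T @ X + alpha * D + beta * X.T @ L @ X
        W = np.linalg.solve(A, X.T @ X)
    scores = np.linalg.norm(W, axis=1)  # rank features by row norm of W
    return np.argsort(scores)[::-1][:n_select]
```

In this sketch, the graph Laplacian is recomputed from the current reconstruction at every iteration, which is one simple way to realize the "iteratively updated manifold structure" the abstract refers to; the actual method may couple the two updates differently.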
|
Received: 26 July 2020
|
|
Fund: National Key R&D Program of China (No. 2019YFA0706200), National Natural Science Foundation of China (No. 62072152, 61702156, 61772171, 61876056), Natural Science Foundation of Anhui Province (No. 1808085QF188), Fundamental Research Funds for the Central Universities (No. PA2020GDKC0023, PA2019GDZC0095).
Corresponding Authors:
GUO Yanrong, Ph.D., associate professor. Her research interests include computer vision and medical image analysis.
|
About authors: HAO Shijie, Ph.D., associate professor. His research interests include pattern recognition, image processing and analysis.
CHEN Tao, Ph.D. candidate. Her research interests include convolutional neural networks and multimodal fusion.
WANG Meng, Ph.D., professor. His research interests include pattern recognition, data mining and multimedia information processing.
HONG Richang, Ph.D., professor. His research interests include pattern recognition and multimedia question answering.
|
|
|