Integration of Random Subspace and Min-Max Modular SVM
YU Yi, WU Jiao-Gao, LI Yun
School of Computer Science and Technology, Nanjing University of Posts and Telecommunications, Nanjing 210023 |
Abstract The min-max modular support vector machine (M3-SVM) is a powerful tool for dealing with large-scale data. To improve the classification performance of M3-SVM on high-dimensional and unbalanced data, several random subspace strategies are analyzed and combined with M3-SVM to reduce the dimensionality and introduce an ensemble mechanism at the feature level. On this basis, the min-max modular support vector machine with random subspace is proposed. Experimental results on real-world datasets, including unbalanced data, show that the proposed random subspace strategy enhances the classification performance of M3-SVM. Moreover, the diversity between sub-modules (base learners) in M3-SVM is discussed.
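The following is a minimal sketch, not the authors' implementation, of how the random subspace method can be combined with a min-max modular SVM. It assumes scikit-learn's SVC as the base learner; the class name RandomSubspaceM3SVM, the module counts, the subspace ratio, and the assumption of binary labels {0, 1} with enough samples in each partition are all illustrative choices made for this example.

```python
# Illustrative sketch only (not the paper's implementation): random subspace
# sampling at the feature level combined with min-max modular task decomposition.
import numpy as np
from sklearn.svm import SVC


class RandomSubspaceM3SVM:
    def __init__(self, n_pos_parts=2, n_neg_parts=3, subspace_ratio=0.5,
                 random_state=0, **svc_params):
        self.n_pos_parts = n_pos_parts        # partitions of the positive class
        self.n_neg_parts = n_neg_parts        # partitions of the negative class
        self.subspace_ratio = subspace_ratio  # fraction of features per module
        self.rng = np.random.RandomState(random_state)
        self.svc_params = svc_params

    def fit(self, X, y):
        # Task decomposition: split each class into smaller subsets, then train
        # one SVM per (positive subset, negative subset) pair on a randomly
        # sampled feature subspace.
        pos_parts = np.array_split(self.rng.permutation(np.where(y == 1)[0]),
                                   self.n_pos_parts)
        neg_parts = np.array_split(self.rng.permutation(np.where(y == 0)[0]),
                                   self.n_neg_parts)
        n_feat = X.shape[1]
        k = max(1, int(self.subspace_ratio * n_feat))
        self.modules_ = []
        for pos_idx in pos_parts:
            row = []
            for neg_idx in neg_parts:
                feats = self.rng.choice(n_feat, size=k, replace=False)
                idx = np.concatenate([pos_idx, neg_idx])
                clf = SVC(**self.svc_params).fit(X[np.ix_(idx, feats)], y[idx])
                row.append((feats, clf))
            self.modules_.append(row)
        return self

    def decision_function(self, X):
        # Module combination: MIN over the negative-class partitions, then MAX
        # over the positive-class partitions (the min-max rule).
        row_mins = [np.min([clf.decision_function(X[:, feats])
                            for feats, clf in row], axis=0)
                    for row in self.modules_]
        return np.max(row_mins, axis=0)

    def predict(self, X):
        return (self.decision_function(X) > 0).astype(int)
```

In this sketch, each sub-module sees only a random fraction of the features, which is the source of the feature-level ensemble and of the diversity between sub-modules discussed in the paper; the min-max combination rule itself follows the standard M3 decomposition.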
Received: 13 May 2013