A Classifier Ensemble Algorithm Based on Local Random Subspace
YANG Ming, WANG Fei
School of Computer Science and Technology, Nanjing Normal University, Nanjing 210046
Jiangsu Research Center of Information Security and Privacy Technology, Nanjing 210046
Abstract: Classifier ensemble learning is one of the current research focuses in the field of machine learning. However, the classical completely random subspace selection method cannot guarantee good performance of the sub-classifiers on high-dimensional datasets. Therefore, a classifier ensemble algorithm based on local random subspace is proposed. Firstly, the features are ranked by a feature selection strategy; then the ranked feature list is partitioned into several parts, and features are randomly selected within each part according to a given sampling rate. In this way, both the performance of the sub-classifiers and the diversity among them are improved. Experiments are carried out on 5 UCI datasets and 5 gene datasets. The experimental results show that the proposed algorithm is superior to a single classifier, and in most cases it outperforms the classical classifier ensemble methods.
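The procedure the abstract outlines can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the mutual-information ranking, the decision-tree base learner, the majority vote, and the names local_random_subspace_ensemble, n_parts, and sampling_rate are choices made for this sketch only.

```python
import numpy as np
from sklearn.base import clone
from sklearn.tree import DecisionTreeClassifier
from sklearn.feature_selection import mutual_info_classif

def local_random_subspace_ensemble(X, y, n_estimators=20, n_parts=5,
                                   sampling_rate=0.5, base=None, seed=None):
    rng = np.random.default_rng(seed)
    base = base if base is not None else DecisionTreeClassifier()
    # Rank features by relevance; mutual information stands in for the
    # paper's feature selection strategy (an assumption of this sketch).
    order = np.argsort(mutual_info_classif(X, y))[::-1]
    # Partition the ranked list into contiguous blocks ("local" regions).
    blocks = np.array_split(order, n_parts)
    models = []
    for _ in range(n_estimators):
        # Draw sampling_rate of the features from every block, so each
        # subspace mixes strong, medium, and weak features instead of
        # being drawn completely at random over all dimensions.
        subspace = np.concatenate([
            rng.choice(b, size=max(1, int(len(b) * sampling_rate)),
                       replace=False)
            for b in blocks
        ])
        clf = clone(base).fit(X[:, subspace], y)
        models.append((clf, subspace))
    return models

def predict_majority(models, X):
    # Combine sub-classifiers by majority vote (integer class labels assumed).
    votes = np.stack([clf.predict(X[:, s]) for clf, s in models]).astype(int)
    return np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, votes)
```

A typical call would be models = local_random_subspace_ensemble(X_train, y_train) followed by y_pred = predict_majority(models, X_test). Sampling within each ranked block, rather than over the whole feature set, is what keeps every subspace from being dominated by weak features on high-dimensional data.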
YANG Ming, WANG Fei. A Classifier Ensemble Algorithm Based on Local Random Subspace. Pattern Recognition and Artificial Intelligence, 2012, 25(4): 595-603.