Pairwise Diversity Measures Based Selective Ensemble Method
YANG Chang-Sheng, TAO Liang, CAO Zhen-Tian, WANG Shi-Yi
Key Laboratory of Intelligent Computing and Signal Processing of Ministry of Education, Anhui University, Hefei 230039
Abstract Generating individual learners with strong generalization ability and high diversity is the key issue in ensemble learning. To improve both the diversity and the accuracy of the learners, a Pairwise Diversity Measures based Selective Ensemble (PDMSEN) method is proposed in this paper. Furthermore, an improved variant is studied to increase the speed of the algorithm and to support parallel computation. Finally, using BP neural networks as base learners, experiments are carried out on selected UCI datasets, and the improved algorithm is compared with the Bagging and GASEN (Genetic Algorithm based Selective Ensemble) algorithms. Experimental results demonstrate that the proposed algorithm learns faster than GASEN while achieving the same learning performance.
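The abstract does not spell out the PDMSEN selection procedure itself; as a rough illustration only, the sketch below shows one common way a pairwise diversity measure (here the disagreement measure, i.e., the fraction of samples on which two learners predict differently) can drive greedy selection of a diverse subset from a pool of trained learners. All function names and the greedy strategy are assumptions for illustration, not the authors' algorithm.

```python
import numpy as np

def disagreement(a, b):
    # Pairwise disagreement measure: fraction of samples on which
    # the two learners' predicted labels differ.
    return float(np.mean(a != b))

def select_diverse(preds, k):
    """Greedily pick k learners whose predictions are mutually diverse.

    preds: (n_learners, n_samples) array of predicted labels.
    Starts from learner 0 (arbitrary seed choice, an assumption here)
    and repeatedly adds the learner with the largest mean pairwise
    disagreement with the learners already chosen.
    """
    chosen = [0]
    remaining = list(range(1, len(preds)))
    while len(chosen) < k and remaining:
        best = max(
            remaining,
            key=lambda i: np.mean([disagreement(preds[i], preds[j])
                                   for j in chosen]),
        )
        chosen.append(best)
        remaining.remove(best)
    return chosen

# Toy predictions from 4 learners on 6 samples
preds = np.array([
    [0, 1, 1, 0, 1, 0],
    [0, 1, 1, 0, 1, 0],   # duplicate of learner 0: zero diversity
    [1, 0, 1, 0, 0, 1],   # disagrees with learner 0 on 4 of 6 samples
    [0, 1, 0, 1, 1, 0],   # disagrees with learner 0 on 2 of 6 samples
])
print(select_diverse(preds, k=2))  # → [0, 2]
```

The duplicate learner is never chosen: maximizing pairwise disagreement prunes redundant ensemble members, which is the general motivation behind diversity-based selective ensembles.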
Received: 22 April 2009
|
|
|
|
|