Abstract: A new algorithm is presented to improve the classification ability of the radial basis function neural network (RBFNN). It constructs the RBFNN with a cooperative coevolutionary algorithm. The K-means method is employed to divide the initial hidden nodes into modules, which serve as the species of the coevolutionary algorithm. The best individuals found in all species are then combined to form the complete RBFNN structure. A matrix-form mixed encoding scheme with a control vector is adopted, and the weights between the hidden layer and the output layer are computed by the pseudo-inverse algorithm. The proposed algorithm is tested on UCI datasets, and the results show that it outperforms existing methods with higher accuracy and a simpler network structure.
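To make two of the routine steps named in the abstract concrete, the sketch below (a minimal Python illustration, not the authors' implementation) shows K-means producing candidate hidden-node centers that are then grouped into species, and the pseudo-inverse solution of the hidden-to-output weights. The function names, the constant Gaussian-width heuristic, and the one-hot target encoding are illustrative assumptions.

```python
# Minimal sketch of two building blocks mentioned in the abstract:
# (1) K-means to obtain candidate RBF centers and split them into species,
# (2) pseudo-inverse computation of the hidden-to-output weights.
import numpy as np
from sklearn.cluster import KMeans

def kmeans_species(X, n_centers, n_species, seed=0):
    """Cluster training data into candidate RBF centers, then split the
    centers themselves into `n_species` groups (the coevolving modules)."""
    centers = KMeans(n_clusters=n_centers, n_init=10, random_state=seed).fit(X).cluster_centers_
    grouping = KMeans(n_clusters=n_species, n_init=10, random_state=seed).fit(centers)
    return centers, grouping.labels_   # labels_ assigns each center to a species

def rbf_activations(X, centers, widths):
    """Gaussian basis functions: phi_ij = exp(-||x_i - c_j||^2 / (2 * sigma_j^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * widths[None, :] ** 2))

def output_weights(X, Y, centers, widths):
    """Hidden-to-output weights via the Moore-Penrose pseudo-inverse: W = pinv(Phi) Y."""
    Phi = rbf_activations(X, centers, widths)
    return np.linalg.pinv(Phi) @ Y

# Toy usage: a 3-class problem with one-hot targets.
rng = np.random.default_rng(0)
X = rng.normal(size=(150, 4))
y = rng.integers(0, 3, size=150)
Y = np.eye(3)[y]                              # one-hot encoding of class labels
centers, species = kmeans_species(X, n_centers=12, n_species=3)
widths = np.full(len(centers), 1.0)           # assumed constant width heuristic
W = output_weights(X, Y, centers, widths)
pred = rbf_activations(X, centers, widths) @ W
print("training accuracy:", (pred.argmax(axis=1) == y).mean())
```

In the full algorithm, the species of centers would be evolved cooperatively and only the selected hidden nodes (indicated by the control vector) would enter the activation matrix; the pseudo-inverse step above is applied to whatever subset of nodes is currently active.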
TIAN Jin, LI Minqiang, CHEN Fuzan. A Classification Algorithm for RBFNN Based on Cooperative Coevolution. Pattern Recognition and Artificial Intelligence, 2008, 21(1): 88-97 (in Chinese)
(田津, 李敏强, 陈富赞. 基于合作型协同进化的RBFNN分类算法. 模式识别与人工智能, 2008, 21(1): 88-97)