SANG Nong¹, XIE YanTao¹, GAO RuXin¹, ZHANG TianXu²
1. Institute for Pattern Recognition and Artificial Intelligence, Huazhong University of Science and Technology, Wuhan 430074
2. Key Laboratory of Ministry of Education for Image Processing and Intelligent Control, Huazhong University of Science and Technology, Wuhan 430074
Abstract: To some extent, feature selection algorithms based on artificial neural networks can be regarded as special cases of architecture pruning algorithms. However, they usually require data normalization as a preprocessing step, which may alter the distribution of the original data, a distribution that is important for classification. Neuro-fuzzy networks are fuzzy inference systems with self-learning ability. In this paper, a neuro-fuzzy network is combined with an architecture pruning algorithm based on the membership space, and a new feature selection algorithm is proposed. The membership functions of the algorithm are learned adaptively, and the learning is completed before feature selection. Experiments on natural and synthetic data are reported and compared with several traditional techniques. The results show that the proposed method outperforms the traditional ones.
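For concreteness, the following is a minimal Python sketch of the general idea behind neuro-fuzzy feature selection described in the abstract: per-class Gaussian membership functions are fitted for each feature, a product-rule fuzzy classifier scores the classes, and features are ranked by the accuracy drop caused by removing them. This is an illustrative simplification under stated assumptions, not the paper's membership-space pruning algorithm; the function names (fit_memberships, class_scores, rank_features) and the leave-one-feature-out ranking criterion are introduced here for illustration only.

# Minimal sketch of neuro-fuzzy feature ranking (illustrative, not the
# authors' membership-space pruning method). Membership functions are
# Gaussians fitted per (class, feature) pair; features are scored by how
# much removing them degrades the fuzzy classifier's training accuracy.
import numpy as np

def fit_memberships(X, y):
    """Fit one Gaussian membership function per (class, feature) pair."""
    classes = np.unique(y)
    mu = np.array([X[y == c].mean(axis=0) for c in classes])
    sigma = np.array([X[y == c].std(axis=0) + 1e-6 for c in classes])
    return classes, mu, sigma

def class_scores(X, mu, sigma, feats):
    """Product of per-feature memberships over the selected feature subset."""
    # memberships m has shape (n_samples, n_classes, n_features)
    m = np.exp(-0.5 * ((X[:, None, :] - mu[None]) / sigma[None]) ** 2)
    return m[:, :, feats].prod(axis=2)

def rank_features(X, y):
    """Rank features by the accuracy drop caused by removing each one."""
    classes, mu, sigma = fit_memberships(X, y)
    all_feats = np.arange(X.shape[1])
    base = np.mean(classes[class_scores(X, mu, sigma, all_feats).argmax(1)] == y)
    drops = []
    for f in all_feats:
        keep = np.delete(all_feats, f)
        acc = np.mean(classes[class_scores(X, mu, sigma, keep).argmax(1)] == y)
        drops.append(base - acc)   # large drop -> feature is important
    return np.argsort(drops)[::-1], np.array(drops)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: two informative features plus two pure-noise features.
    n = 200
    y = rng.integers(0, 2, n)
    X = np.c_[y + 0.3 * rng.standard_normal(n),
              -y + 0.3 * rng.standard_normal(n),
              rng.standard_normal(n),
              rng.standard_normal(n)]
    order, drops = rank_features(X, y)
    print("features ranked by importance:", order)

On such toy data the two informative features should receive the largest accuracy drops and therefore the top ranks, while the noise features can be pruned with little loss.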