Abstract: Multiconlitron is a general framework for constructing piecewise linear classifiers. It correctly separates convexly separable and commonly separable datasets using the support conlitron algorithm (SCA) and the support multiconlitron algorithm (SMA), respectively. On this basis, a soft-margin method for multiconlitron design is proposed. First, the training samples are mapped from the input space to a high-dimensional feature space, and the samples of one class are clustered into several groups by the K-means algorithm. Then, a conlitron is constructed between each group and the samples of the other class, and the integrated model, a multiconlitron, is obtained. The proposed method overcomes the inapplicability of the original model to commonly inseparable datasets, and by simplifying the model structure it further improves classification accuracy and generalization ability. Experimental results show that the proposed method outperforms several other piecewise linear classifiers, verifying its effectiveness and advantages.
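The pipeline described above can be illustrated with a minimal sketch. The Python snippet below is an assumption-laden approximation, not the paper's implementation: scikit-learn's RBFSampler stands in for the unspecified feature mapping, K-means clusters one class into groups, and a soft-margin LinearSVC replaces the support conlitron algorithm (SCA), so each "conlitron" here collapses to a single soft-margin hyperplane rather than an intersection of half-spaces. The pieces are integrated by taking a maximum over their decision values, and make_moons serves only as a stand-in for a commonly inseparable dataset.

# Minimal sketch of the soft-margin multiconlitron pipeline (hypothetical stand-ins).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_moons
from sklearn.kernel_approximation import RBFSampler
from sklearn.svm import LinearSVC

X, y = make_moons(n_samples=400, noise=0.2, random_state=0)  # commonly inseparable data

# Step 1: map the input space to a high-dimensional feature space
# (RBFSampler is an assumed substitute for the paper's feature mapping).
feature_map = RBFSampler(gamma=1.0, n_components=200, random_state=0)
Phi = feature_map.fit_transform(X)

# Step 2: cluster the samples of one class (here the positive class) into K groups.
pos, neg = Phi[y == 1], Phi[y == 0]
K = 4
groups = KMeans(n_clusters=K, n_init=10, random_state=0).fit_predict(pos)

# Step 3: build one soft-margin linear separator per group vs. the other class
# (LinearSVC is an assumed substitute for the SCA; each piece is one hyperplane).
pieces = []
for k in range(K):
    Xk = np.vstack([pos[groups == k], neg])
    yk = np.hstack([np.ones((groups == k).sum()), -np.ones(len(neg))])
    pieces.append(LinearSVC(C=1.0, max_iter=10000).fit(Xk, yk))

# Step 4: integrate the pieces into one piecewise linear model: a point is
# labeled positive if any piece accepts it (a max over the decision values).
def predict(X_new):
    Phi_new = feature_map.transform(X_new)
    scores = np.column_stack([m.decision_function(Phi_new) for m in pieces])
    return (scores.max(axis=1) > 0).astype(int)

print("training accuracy:", (predict(X) == y).mean())

In this sketch the soft margin enters through the C parameter of each linear piece, which tolerates samples inside the margin; the full method would instead grow each group's conlitron from several hyperplanes before taking the maximum over groups.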