Improved Kernel Minimum Squared Error Method and Its Implementations
XU Yong¹, LU JianFeng², JIN Zhong², YANG JingYu²

1. Shenzhen Graduate School, Harbin Institute of Technology, Shenzhen 518055
2. Department of Computer Science and Technology, Nanjing University of Science and Technology, Nanjing 210094

Abstract Since the discriminant vector of the kernel minimum squared error (KMSE) model in the feature space can be expressed as a linear combination of a subset of the training samples, the idea of variable selection can be exploited to improve the KMSE model. To raise classification efficiency, an algorithm based on the minimum squared error criterion is proposed; because its kernel expansion uses only the selected samples, it classifies test samples efficiently. Experiments show that the proposed method also achieves good classification performance.
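The abstract's central idea, a discriminant expanded over a selected subset of the training samples rather than all of them, can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's algorithm: the variable-selection step that chooses the subset `idx` is omitted, and the RBF kernel, the ridge term, and all function names are the sketch's own.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Pairwise RBF (Gaussian) kernel between rows of A and rows of B."""
    d = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * np.maximum(d, 0.0))

def train_kmse(X, y, idx, gamma=1.0, ridge=1e-8):
    """Fit a KMSE-style discriminant whose kernel expansion uses only X[idx].

    Solves the regularized least-squares problem
        min_a || [1, K(X, X[idx])] a - y ||^2 + ridge * ||a||^2,
    so the discriminant is a linear combination of the selected
    samples only (plus a bias term) -- the source of the efficiency
    gain described in the abstract.
    """
    Phi = np.hstack([np.ones((len(X), 1)), rbf_kernel(X, X[idx], gamma)])
    A = Phi.T @ Phi + ridge * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ y)

def predict_kmse(a, X_sel, Z, gamma=1.0):
    """Classify rows of Z; only the selected samples X_sel are needed."""
    Phi = np.hstack([np.ones((len(Z), 1)), rbf_kernel(Z, X_sel, gamma)])
    return np.sign(Phi @ a)

# Toy two-class usage (XOR layout; here all samples are "selected"):
X = np.array([[0., 0.], [1., 1.], [0., 1.], [1., 0.]])
y = np.array([1., 1., -1., -1.])
idx = np.arange(len(X))
a = train_kmse(X, y, idx)
pred = predict_kmse(a, X[idx], X)
```

In the paper's setting, `idx` would instead be chosen by a variable-selection procedure under the minimum squared error criterion, so that test-time prediction touches only a fraction of the training set.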
Received: 09 February 2006
|
|
|
|
|
|
|
|