Face Recognition Using Neighborhood Preserving Maximal Margin Analysis of Kernel Ridge Regression
LI Yong-Zhou, LUO Da-Yong, LIU Shao-Qiang
School of Information Science and Engineering, Central South University, Changsha 410083
Abstract: Neighborhood preserving embedding is a linear approximation to locally linear embedding that emphasizes preserving the local structure of the data manifold. The modified maximal margin criterion focuses on the discriminant and geometrical structure of the data manifold and improves classification performance. An algorithm called neighborhood preserving maximal margin analysis of kernel ridge regression is proposed. It constructs the objective function by preserving the local structure of the manifold and maximizing the margins between data of different classes. Since the data manifold is highly nonlinear, kernel ridge regression is adopted to calculate the transformation matrix. The proposed algorithm first obtains the mapped results of the data samples, and then the kernel subspace is obtained by kernel ridge regression. Experimental results on standard face databases demonstrate that the proposed algorithm is correct and effective, and that it outperforms popular manifold learning algorithms.
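As a rough illustration of the two-step procedure described in the abstract (first compute low-dimensional responses that preserve local structure while separating classes, then recover the kernel subspace by kernel ridge regression), the sketch below shows one possible arrangement. It is a minimal sketch, not the paper's implementation: the LLE-style reconstruction weights, the class-graph Laplacians used as a margin-style term, the RBF kernel, and all parameter names (k, beta, gamma, lam) are assumptions chosen for the example.

```python
# Minimal sketch of a neighborhood-preserving, margin-style embedding
# followed by kernel ridge regression (assumed details, see lead-in).
import numpy as np
from scipy.linalg import eigh, solve
from scipy.spatial.distance import cdist


def rbf_kernel(A, B, gamma=1e-3):
    """Gaussian kernel matrix between the rows of A and B (assumed kernel)."""
    return np.exp(-gamma * cdist(A, B, "sqeuclidean"))


def lle_weights(X, k=5, reg=1e-3):
    """LLE-style reconstruction weights: each sample is reconstructed from
    its k nearest neighbours, with rows summing to 1."""
    n = X.shape[0]
    D = cdist(X, X)
    np.fill_diagonal(D, np.inf)
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(D[i])[:k]
        Z = X[idx] - X[i]                      # centred neighbours
        G = Z @ Z.T
        G = G + reg * np.trace(G) * np.eye(k)  # regularise the local Gram matrix
        w = solve(G, np.ones(k))
        W[i, idx] = w / w.sum()
    return W


def npmma_embedding(X, y, k=5, dim=10, beta=1.0):
    """Step 1: responses Y that preserve local reconstruction (neighbourhood
    preserving term) while pushing different classes apart (margin-style
    between/within class Laplacians). The weighting `beta` is an assumption."""
    n = X.shape[0]
    W = lle_weights(X, k)
    I = np.eye(n)
    M = (I - W).T @ (I - W)                    # neighbourhood preserving penalty

    same = (y[:, None] == y[None, :]).astype(float)
    Lw = np.diag(same.sum(1)) - same           # within-class Laplacian
    diff = 1.0 - same
    Lb = np.diag(diff.sum(1)) - diff           # between-class Laplacian

    # Maximise between-class separation against local reconstruction error
    # and within-class spread via a generalised eigenproblem.
    vals, vecs = eigh(Lb, M + beta * Lw + 1e-6 * I)
    return vecs[:, ::-1][:, :dim]              # top eigenvectors as responses Y


def kernel_ridge_fit(X, Y, gamma=1e-3, lam=1e-2):
    """Step 2: kernel ridge regression from training samples to the responses Y,
    giving the coefficients of the kernel subspace."""
    K = rbf_kernel(X, X, gamma)
    alpha = solve(K + lam * np.eye(K.shape[0]), Y)   # (K + lam*I) alpha = Y
    return alpha


def kernel_ridge_project(X_train, X_new, alpha, gamma=1e-3):
    """Project new samples into the learned kernel subspace."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 30))              # stand-in for vectorised face images
    y = np.repeat(np.arange(6), 10)            # 6 classes, 10 samples each
    Y = npmma_embedding(X, y, k=5, dim=5)
    alpha = kernel_ridge_fit(X, Y)
    Z = kernel_ridge_project(X, X, alpha)      # here, embed the training set itself
    print(Z.shape)                             # (60, 5)
```

In this spectral-regression-style arrangement, the embedding is found first from the eigenproblem, and kernel ridge regression then supplies coefficients that reproduce it for unseen samples, which matches the order of steps stated in the abstract.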
Received: 18 September 2008