Adaptive Regularization Based Kernel Two Dimensional Discriminant Analysis
JIANG Wei¹, ZHANG Jing¹, YANG Bing-Ru²
1. School of Mathematics, Liaoning Normal University, Dalian 116029; 2. School of Computer and Communication Engineering, University of Science and Technology Beijing, Beijing 100083
Abstract: In traditional semi-supervised dimensionality reduction techniques, the manifold regularization term is defined in the original feature space. However, the graph constructed there may be unhelpful for the subsequent classification. In this paper, adaptive regularization based kernel two-dimensional discriminant analysis (ARKTDDA) is presented. First, each image matrix is factored, via singular value decomposition, into the product of two orthogonal matrices and a diagonal matrix. The column vectors of the two orthogonal matrices are mapped into high-dimensional spaces by two kernel functions. The adaptive regularization term is then defined in the low-dimensional feature space and integrated with the two-dimensional nonlinear matrix method into a single objective function. By alternating iterative optimization, discriminative information is extracted in the two kernel subspaces. Finally, experimental results on two face datasets demonstrate that the proposed algorithm achieves a considerable improvement in classification accuracy.