Improved Covariance-Feature-Based Lie-KNN Classification Algorithm
WANG Bang-Jun1,2, LI Fan-Zhang2, ZHANG Li2, YU Jian1, HE Shu-Ping2
1. School of Computer and Information Technology, Beijing Jiaotong University, Beijing 100044, China
2. School of Computer Science and Technology, Soochow University, Suzhou 215006, China
Abstract: K-nearest neighbor (KNN) classification is simple and efficient, and is widely used both for classification problems and as a baseline for comparison. In practice, however, data, especially data with complex high-dimensional structure, do not always lie in a Euclidean space. How to select sample features and how to compute the distances between samples are therefore hard problems for KNN. Taking these factors into account, a multi-covariance Lie-KNN classification method based on the image region covariance descriptor is proposed. The method combines the simplicity and effectiveness of KNN with the ability of Lie group structures to represent complex data and to compute distances between them, and it effectively handles the classification of complex high-dimensional data. Experimental results on handwritten digits verify its effectiveness.
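The pipeline the abstract describes can be sketched as follows: each image region is summarized by the covariance matrix of per-pixel feature vectors, and KNN is run on these symmetric positive-definite (SPD) matrices using a distance derived from the Lie group structure of SPD matrices. The sketch below is a minimal illustration, not the authors' exact method: the per-pixel feature set (coordinates, intensity, gradient magnitudes) and the choice of the log-Euclidean metric of Arsigny et al. are assumptions for the example, and the function names (`region_covariance`, `knn_predict`, etc.) are hypothetical.

```python
import numpy as np

def region_covariance(img):
    """Covariance descriptor of an image region.

    Each pixel is mapped to the feature vector
    (x, y, I, |dI/dx|, |dI/dy|); the region is summarized by the
    5x5 covariance matrix of these vectors (an SPD matrix).
    """
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    gy, gx = np.gradient(img.astype(float))
    F = np.stack([xs.ravel(), ys.ravel(), img.ravel().astype(float),
                  np.abs(gx).ravel(), np.abs(gy).ravel()])
    C = np.cov(F)
    return C + 1e-6 * np.eye(C.shape[0])  # regularize to keep it SPD

def spd_log(C):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(C)
    return (V * np.log(w)) @ V.T

def log_euclidean_dist(A, B):
    """Log-Euclidean distance between two SPD matrices:
    the Frobenius norm of the difference of their matrix logs."""
    return np.linalg.norm(spd_log(A) - spd_log(B), ord='fro')

def knn_predict(train_covs, train_labels, test_cov, k=3):
    """KNN vote over covariance descriptors under the log-Euclidean metric."""
    d = [log_euclidean_dist(test_cov, C) for C in train_covs]
    nearest = np.argsort(d)[:k]
    labels, counts = np.unique(np.asarray(train_labels)[nearest],
                               return_counts=True)
    return labels[np.argmax(counts)]
```

Because SPD matrices do not form a vector space, plain Euclidean distance between covariance matrices is geometrically inappropriate; mapping each matrix through the matrix logarithm flattens the manifold so that ordinary norms (and hence KNN) apply.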