Multi-Sample Incremental Manifold Learning Algorithm Based on Isogonal Mapping

TAN Chao1, GUAN Ji-Hong1, ZHOU Shui-Geng2

1. Department of Computer Science and Technology, Tongji University, Shanghai 201804
2. Shanghai Key Laboratory of Intelligent Information Processing, School of Computer Science, Fudan University, Shanghai 200433
|
|
Abstract  In classical dimension-reducing manifold learning algorithms, distance is used to measure the similarity between data, and the resulting subspace deviation caused by noise cannot be corrected. A multi-sample incremental manifold learning algorithm based on isogonal mapping is proposed. The covariance matrix of the high-dimensional samples, originally centered on the sample mean, is recentered on the neighborhood mean, which eliminates the subspace error introduced by distance measurement. The covariance matrix is then weighted, reducing the effect of noise or irregular new samples on dimension reduction. Experimental results show that the proposed algorithm improves on comparable algorithms and can be effectively applied to image recognition.
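The core idea of the abstract, replacing the sample-mean center of the covariance matrix with each point's neighborhood mean and then weighting the result, can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's exact formulation: the function name `neighborhood_centered_cov`, the k-nearest-neighbor definition of "neighborhood", and the uniform default weights are all hypothetical choices made for this sketch.

```python
import numpy as np

def neighborhood_centered_cov(X, k=5, weights=None):
    """Weighted covariance of samples centered on each point's
    k-nearest-neighbor mean rather than the global sample mean.

    A hypothetical sketch of the idea in the abstract; the paper's
    actual weighting scheme is not reproduced here.
    """
    n, d = X.shape
    # Pairwise squared Euclidean distances between all samples.
    D = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    centered = np.empty_like(X)
    for i in range(n):
        # Indices of the k nearest neighbors of sample i (excluding itself).
        nn = np.argsort(D[i])[1:k + 1]
        # Center sample i on its neighborhood mean, not the sample mean.
        centered[i] = X[i] - X[nn].mean(axis=0)
    if weights is None:
        weights = np.ones(n)       # uniform weights as a placeholder
    w = weights / weights.sum()
    # Weighted covariance of the neighborhood-centered samples.
    return (centered * w[:, None]).T @ centered
```

Down-weighting noisy or irregular new samples (small entries in `weights`) shrinks their contribution to the covariance, which is the mechanism the abstract relies on for robust incremental updates.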
|
Received: 13 May 2013
|
|
|
|
|
|
|
|