Self-Regulation of Neighborhood Parameter for Locally Linear Embedding
HUI Kang-Hua 1,2, XIAO Bai-Hua 1, WANG Chun-Heng 1
1. Key Laboratory of Complex Systems and Intelligence Science, Institute of Automation, Chinese Academy of Sciences, Beijing 100190
2. College of Computer Science and Technology, Civil Aviation University of China, Tianjin 300300
Abstract: The locally linear embedding (LLE) algorithm is considered a powerful method for nonlinear dimensionality reduction. A method called self-regulated LLE is proposed. It finds the locally linear patch of each sample by analyzing that sample's locally linear reconstruction errors, and then, according to the samples on the locally linear patch, selects an appropriate neighborhood parameter for LLE. Experimental results on several datasets show that self-regulated LLE outperforms the original LLE under different evaluation criteria while requiring less computation time.
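The abstract only sketches the approach, so the following is a minimal illustrative sketch, not the authors' exact procedure, of how per-sample locally linear reconstruction errors (the standard LLE weight-fitting step) can be computed and used to pick a neighborhood size. The function names, the candidate range, the regularization, and the selection rule (smallest mean error over a candidate set) are assumptions made for illustration.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def mean_reconstruction_error(X, k, reg=1e-3):
    """Mean locally linear reconstruction error over all samples for a given k.

    For each sample, solve for weights on its k nearest neighbors that sum to
    one and minimize the squared reconstruction error (the usual LLE weight
    step); return the average residual over the dataset.
    """
    n = X.shape[0]
    nbrs = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nbrs.kneighbors(X)              # idx[:, 0] is the sample itself
    total = 0.0
    for i in range(n):
        Z = X[idx[i, 1:]] - X[i]             # neighbors centered on sample i, shape (k, d)
        G = Z @ Z.T                          # local Gram matrix, shape (k, k)
        trace = np.trace(G)
        G += np.eye(k) * reg * (trace if trace > 0 else 1.0)   # regularize for stability
        w = np.linalg.solve(G, np.ones(k))
        w /= w.sum()                         # enforce the sum-to-one constraint
        total += np.sum((X[i] - w @ X[idx[i, 1:]]) ** 2)
    return total / n

def select_neighborhood_size(X, k_candidates=range(4, 21)):
    """Assumed selection rule: pick the k with the smallest mean reconstruction error."""
    errors = {k: mean_reconstruction_error(X, k) for k in k_candidates}
    return min(errors, key=errors.get)
```

Note that raw reconstruction error tends to shrink as k grows, so in practice a criterion of this kind is usually combined with a check that the chosen neighbors still lie on a locally linear patch; the paper's actual criterion is not reproduced here.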