A Latent Variable Model Based on Local Preservation

WANG Xiu-Mei 1,2, GAO Xin-Bo 2, ZHANG Qian-Kun 2, SONG Guo-Xiang 1

Pattern Recognition and Artificial Intelligence, 2010, Vol. 23, Issue 3: 369-375.
Original Article


Abstract

The latent variable model (LVM) is an efficient nonlinear dimensionality reduction technique that establishes smooth kernel mappings from the latent space to the data space. However, such mappings do not guarantee that points close in the data space remain close in the latent space. To address this, an LVM based on locality preserving projection (LPP) is proposed, which preserves the local structure of the dataset. The objective function of LPP is treated as a prior over the latent variables in the Gaussian process latent variable model (GP-LVM), and the locality preserving GP-LVM is built with this constrained term. Experimental results on common datasets show that the proposed method preserves local structure better than the traditional LPP and GP-LVM.
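To illustrate the idea described in the abstract, the following is a minimal NumPy sketch (not taken from the paper) of a GP-LVM negative log-likelihood augmented with the LPP objective tr(XᵀLX) as a prior term on the latent variables. The RBF kernel choice, the k-nearest-neighbour graph construction, and the weight `lam` are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0, noise=1e-3):
    # RBF kernel matrix over latent points X (n x q), with jitter for stability.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2) + noise * np.eye(len(X))

def graph_laplacian(Y, k=2):
    # k-NN adjacency in DATA space; returns the unnormalized graph Laplacian D - W.
    n = len(Y)
    d2 = np.sum((Y[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:k + 1]  # nearest neighbours, excluding self
        W[i, idx] = 1.0
    W = np.maximum(W, W.T)                # symmetrize the adjacency
    return np.diag(W.sum(axis=1)) - W

def lp_gplvm_objective(X, Y, L, lam=1.0, gamma=1.0):
    # GP-LVM negative log-likelihood plus lam * tr(X^T L X) as the LPP prior.
    # The trace term equals (1/2) * sum_ij W_ij ||x_i - x_j||^2, so it is small
    # when data-space neighbours stay close in the latent space.
    n, d = Y.shape
    K = rbf_kernel(X, gamma)
    _, logdet = np.linalg.slogdet(K)
    nll = 0.5 * d * logdet + 0.5 * np.trace(np.linalg.solve(K, Y @ Y.T))
    lpp = np.trace(X.T @ L @ X)
    return nll + lam * lpp
```

In practice, this objective would be minimized over X (e.g., by gradient descent), with `lam` trading off data fit against locality preservation; `lam = 0` recovers the plain GP-LVM objective.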

Key words

Dimensionality Reduction / Latent Variable Model (LVM) / Local Distance Preservation

Cite this article

Download Citations
WANG Xiu-Mei, GAO Xin-Bo, ZHANG Qian-Kun, SONG Guo-Xiang. A Latent Variable Model Based on Local Preservation. Pattern Recognition and Artificial Intelligence, 2010, 23(3): 369-375

References

[1] Bartholomew D J. Statistical Factor Analysis and Related Methods. New York, USA: Wiley, 2004
[2] Tipping M E, Bishop C M. Probabilistic Principal Component Analysis. Journal of the Royal Statistical Society: Series B, 1999, 61(3): 611-622
[3] Schölkopf B, Smola A J, Müller K R. Nonlinear Component Analysis as a Kernel Eigenvalue Problem. Neural Computation, 1998, 10(5): 1299-1319
[4] Lin Tong, Zha Hongbin, Lee S. Riemannian Manifold Learning for Nonlinear Dimensionality Reduction // Proc of the European Conference on Computer Vision. Graz, Austria, 2006: 44-55
[5] Tenenbaum J B, de Silva V, Langford J C. A Global Geometric Framework for Nonlinear Dimensionality Reduction. Science, 2000, 290(5500): 2319-2323
[6] Lawrence N D. Probabilistic Non-Linear Principal Component Analysis with Gaussian Process Latent Variable Models. Journal of Machine Learning Research, 2005, 6: 1783-1816
[7] Lawrence N D. Gaussian Process Latent Variable Models for Visualisation of High Dimensional Data // Thrun S, Saul L K, Schölkopf B, eds. Advances in Neural Information Processing Systems. Cambridge, USA: MIT Press, 2004, XVI: 329-336
[8] Lawrence N D, Quinonero-Candela J. Local Distance Preservation in the GP-LVM through Back Constraints // Proc of the 23rd International Conference on Machine Learning. Pittsburgh, USA, 2006: 513-520
[9] He Xiaofei, Niyogi P. Locality Preserving Projections // Thrun S, Saul L K, Schölkopf B, eds. Advances in Neural Information Processing Systems. Cambridge, USA: MIT Press, 2004: 626-632
[10] Roweis S T, Saul L K. Nonlinear Dimensionality Reduction by Locally Linear Embedding. Science, 2000, 290(5500): 2323-2326
[11] He Xiaofei, Cai Deng, Yan Shuicheng, et al. Neighborhood Preserving Embedding // Proc of the 10th International Conference on Computer Vision. Beijing, China, 2005, II: 1208-1213