Support Vector Data Description with Manifold Embedding
CHEN Bin 1,2, LI Bin 2, PAN Zhi-Song 3, CHEN Song-Can 1
1. College of Information Science and Technology, Nanjing University of Aeronautics & Astronautics, Nanjing 210016
2. College of Information Engineering, Yangzhou University, Yangzhou 225009
3. Institute of Command Automation, PLA University of Science and Technology, Nanjing 210007
Abstract: Geodesic distance is a good metric for capturing the underlying global geometry of data. However, support vector data description (SVDD) with geodesic distance cannot be optimized directly. A framework for manifold-based classifiers is therefore designed: the geodesic distance in the input space is approximated by the Euclidean distance in the feature space induced by isometric feature mapping (ISOMAP) dimensionality reduction, so that a Euclidean-distance-based learning algorithm applied after the ISOMAP step implicitly operates with geodesic distances. The framework is then applied to SVDD, yielding an SVDD variant with ISOMAP manifold embedding (mSVDD). Experimental results on the USPS handwritten digit dataset show that, compared with traditional Euclidean-distance-based SVDD, mSVDD significantly improves one-class classification performance.
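The two-stage pipeline sketched in the abstract (ISOMAP embedding followed by a Euclidean-distance one-class learner) can be illustrated with the short Python sketch below. It uses scikit-learn's Isomap for the geodesic-distance-preserving embedding and OneClassSVM with a Gaussian kernel as a stand-in for SVDD (the two are known to coincide for Gaussian kernels); the neighbourhood size, embedding dimension, nu value and the random placeholder data are illustrative assumptions, not the paper's settings or its USPS experiments.

import numpy as np
from sklearn.manifold import Isomap
from sklearn.svm import OneClassSVM

def fit_msvdd(X_target, n_neighbors=10, n_components=10, nu=0.1):
    # Step 1: ISOMAP embedding -- Euclidean distances in the embedded space
    # approximate geodesic distances in the original input space.
    embed = Isomap(n_neighbors=n_neighbors, n_components=n_components)
    Z = embed.fit_transform(X_target)
    # Step 2: one-class boundary in the embedded space (Gaussian-kernel
    # one-class SVM used here as an SVDD stand-in).
    svdd = OneClassSVM(kernel="rbf", gamma="scale", nu=nu)
    svdd.fit(Z)
    return embed, svdd

def score_msvdd(embed, svdd, X_test):
    # Higher scores indicate samples that look more like the target class.
    return svdd.decision_function(embed.transform(X_test))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(200, 16))   # placeholder target-class data
    X_test = rng.normal(size=(50, 16))
    embed, svdd = fit_msvdd(X_train)
    print(score_msvdd(embed, svdd, X_test)[:5])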
Received: 29 April 2008