Abstract: Support vector data description (SVDD) is an unsupervised learning method with significant applications in image recognition and information security. Coordinate descent is an effective method for large-scale classification problems, offering simple operations and fast convergence. In this paper, an efficient coordinate descent algorithm for solving large-scale SVDD is presented. The solution of the sub-problem at each iteration is derived in closed form, and the computational cost is reduced through an accelerating strategy and cheap computation. Meanwhile, three methods for selecting the sub-problem are developed, and their advantages and disadvantages are analyzed and compared. Experiments on simulated and real large-scale databases validate the performance of the proposed algorithm. Compared with LibSVDD, the proposed algorithm shows great superiority, taking less than 1.4 seconds to solve a database from ijcnn with 10^5 training examples.
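The paper's exact closed-form sub-problem solution and its three selection rules are not reproduced here. As a rough illustration of the general idea, the following is a minimal, hypothetical sketch of pair-wise (SMO-style) coordinate descent on the standard SVDD dual, min alpha^T K alpha - alpha^T diag(K) subject to sum(alpha) = 1 and 0 <= alpha_i <= C; the function name `svdd_smo`, the random pair selection, and all parameters are illustrative assumptions, not the proposed algorithm.

```python
import numpy as np

def svdd_smo(K, C, n_iter=2000, seed=0):
    """Hypothetical sketch: pair-wise coordinate descent for the SVDD dual
        min  alpha^T K alpha - alpha^T diag(K)
        s.t. sum(alpha) = 1,  0 <= alpha_i <= C.
    A pair (i, j) is updated along e_i - e_j so the equality constraint
    is preserved; the 1-D sub-problem has a closed-form minimizer.
    """
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    alpha = np.full(n, 1.0 / n)           # feasible start (requires 1/n <= C)
    grad = 2 * K @ alpha - np.diag(K)     # gradient of the dual objective
    for _ in range(n_iter):
        i, j = rng.choice(n, size=2, replace=False)  # illustrative selection
        denom = 2 * (K[i, i] + K[j, j] - 2 * K[i, j])
        if denom <= 1e-12:
            continue
        # closed-form unconstrained step along e_i - e_j, clipped to the box
        d = -(grad[i] - grad[j]) / denom
        d = np.clip(d, max(-alpha[i], alpha[j] - C),
                       min(C - alpha[i], alpha[j]))
        if d == 0.0:
            continue
        alpha[i] += d
        alpha[j] -= d
        grad += 2 * d * (K[:, i] - K[:, j])  # cheap O(n) gradient maintenance
    return alpha
```

The O(n) incremental gradient update after each pair update is the kind of "cheap computation" such methods rely on; the paper's accelerating strategy and selection methods would replace the random pair choice used here.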