Linearity Property Testing Approach to Gaussian Kernel Selection
HAN Zhizhuo, LIAO Shizhong
School of Computer Science and Technology, Tianjin University, Tianjin 300350

Abstract Kernel selection is critical to the performance of kernel methods. The computational complexity of existing approaches to Gaussian kernel selection is Ω(n²), where n is the sample size, which impedes the development of large-scale kernel methods. To address this issue, a linearity property testing approach to Gaussian kernel selection is proposed. In contrast with existing approaches, the proposed approach requires only O(ln(1/δ)/ε²) queries, and its computational complexity is independent of the sample size. First, a concept called the linearity level is defined, and it is proved that the linearity level approximates the distance between a function and the class of linear functions. A linearity property testing criterion for Gaussian kernel selection is then derived from the linearity level and this approximate distance. The criterion can be applied in the random Fourier feature space to assess candidate Gaussian kernels and select a suitable one. Theoretical and experimental results demonstrate that the proposed linearity property testing approach to Gaussian kernel selection is feasible and effective.
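The two ingredients named in the abstract can be illustrated together. The following is a minimal Python sketch, assuming the standard Rahimi–Recht random Fourier feature map for the Gaussian kernel and a BLR-style additivity test; the function names `rff_map` and `linearity_level`, the tolerance, and the default parameters are illustrative choices, not the paper's exact construction:

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_map(X, sigma, D=500, rng=rng):
    """Random Fourier feature map z(x) with E[z(x)·z(y)] equal to the
    Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 * sigma^2))."""
    d = X.shape[1]
    W = rng.normal(scale=1.0 / sigma, size=(d, D))  # frequencies ~ N(0, I/sigma^2)
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)       # random phases
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

def linearity_level(f, d, eps=0.1, delta=0.05, rng=rng):
    """BLR-style estimate of how additive f is: the fraction of random
    pairs (x, y) satisfying f(x + y) ≈ f(x) + f(y), using the
    O(ln(1/delta) / eps^2) query budget quoted in the abstract."""
    m = int(np.ceil(np.log(1.0 / delta) / eps ** 2))
    hits = 0
    for _ in range(m):
        x, y = rng.normal(size=d), rng.normal(size=d)
        if np.isclose(f(x + y), f(x) + f(y), atol=1e-6):
            hits += 1
    return hits / m
```

For an exactly linear function such as v ↦ Σᵢ vᵢ the estimated level is 1; the farther a function evaluated in the random Fourier feature space is from linear, the lower the level, which is the kind of quantity a selection criterion can threshold.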
Received: 05 May 2017
About author: HAN Zhizhuo, born in 1992, master student. Her research interests include linearity property testing and kernel methods. LIAO Shizhong (corresponding author), born in 1964, Ph.D., professor. His research interests include artificial intelligence and theoretical computer science.