CLC number: TP181
Crosschecked: 2011-03-04
Peng Chen, Yong-zai Lu. Extremal optimization for optimizing kernel function and its parameters in support vector regression[J]. Journal of Zhejiang University Science C, 2011, 12(4): 297-306.
@article{ChenLu2011,
title="Extremal optimization for optimizing kernel function and its parameters in support vector regression",
author="Peng Chen, Yong-zai Lu",
journal="Journal of Zhejiang University Science C",
volume="12",
number="4",
pages="297-306",
year="2011",
publisher="Zhejiang University Press & Springer",
doi="10.1631/jzus.C1000110"
}
Abstract: The performance of a support vector regression (SVR) model is sensitive to the kernel type and its parameters, and determining an appropriate kernel and its associated parameters remains a challenging research topic in support vector learning. In this study, we present a novel method for the simultaneous optimization of the SVR kernel function and its parameters, formulated as a mixed-integer optimization problem and solved with the recently proposed heuristic 'extremal optimization (EO)'. We present the problem formulation for the optimization of the SVR kernel and parameters, the EO-SVR algorithm, and experimental tests on five benchmark regression problems. Comparisons with traditional approaches show that the proposed EO-SVR method achieves better generalization performance by successfully identifying the optimal SVR kernel function and its parameters.
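The abstract casts kernel selection (a discrete choice) and parameter tuning (continuous values) as one mixed-integer search solved by EO. As a rough illustration only, not the paper's implementation, the sketch below runs a τ-EO loop over an integer-coded kernel plus (C, ε, γ) for scikit-learn's SVR on a synthetic regression problem; the local-fitness proxy (cross-validated score after perturbing each component alone), the mutation scheme, and all parameter ranges are assumptions made for this example.

```python
import numpy as np
from sklearn.datasets import make_friedman1
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X, y = make_friedman1(n_samples=100, noise=0.5, random_state=0)

KERNELS = ["rbf", "poly", "sigmoid"]   # candidate kernel types, integer-coded
PARAM_KEYS = ["C", "eps", "gamma"]     # continuous SVR hyperparameters

def fitness(sol):
    """Global fitness: mean 3-fold CV score (negative MSE, higher is better)."""
    model = SVR(kernel=KERNELS[sol["kernel"]],
                C=sol["C"], epsilon=sol["eps"], gamma=sol["gamma"])
    return cross_val_score(model, X, y, cv=3,
                           scoring="neg_mean_squared_error").mean()

def mutate(sol, key):
    """Re-sample one component: a new kernel index, or a log-scale jitter
    of one continuous parameter, clipped to a positive range."""
    new = dict(sol)
    if key == "kernel":
        new["kernel"] = int(rng.integers(len(KERNELS)))
    else:
        new[key] = float(np.clip(new[key] * rng.lognormal(0.0, 0.7), 1e-4, 1e4))
    return new

def eo_svr(iters=25, tau=1.5):
    """tau-EO over the mixed (kernel, C, eps, gamma) solution vector."""
    sol = {"kernel": 0, "C": 1.0, "eps": 0.1, "gamma": 0.1}
    keys = ["kernel"] + PARAM_KEYS
    best, best_fit = dict(sol), fitness(sol)
    for _ in range(iters):
        # Proxy local fitness: global fitness after perturbing each component.
        trials = [mutate(sol, k) for k in keys]
        scored = sorted(zip(map(fitness, trials), range(len(keys))))  # worst first
        # tau-EO selection: pick rank k with probability proportional to k^(-tau),
        # so the worst-ranked component is the most likely to be changed.
        ranks = np.arange(1, len(keys) + 1)
        probs = ranks ** -tau
        probs /= probs.sum()
        pick = int(rng.choice(len(keys), p=probs))
        f, idx = scored[pick]
        sol = trials[idx]          # accept the move unconditionally (EO style)
        if f > best_fit:
            best, best_fit = dict(sol), f
    return best, best_fit
```

Unlike a genetic algorithm, EO keeps a single solution and never rejects a move; progress comes from always tracking the best configuration seen so far while the power-law selection keeps pressure on the worst components.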