CLC number: TP18
On-line Access: 2024-08-27
Received: 2023-10-17
Revision Accepted: 2024-05-08
Crosschecked: 2013-08-07
Hua-juan Huang, Shi-fei Ding, Zhong-zhi Shi. Primal least squares twin support vector regression[J]. Journal of Zhejiang University Science C, 2013, 14(9): 722-732.
@article{Huang2013plstsvr,
title="Primal least squares twin support vector regression",
author="Hua-juan Huang, Shi-fei Ding, Zhong-zhi Shi",
journal="Journal of Zhejiang University Science C",
volume="14",
number="9",
pages="722-732",
year="2013",
publisher="Zhejiang University Press & Springer",
doi="10.1631/jzus.CIIP1301"
}
%0 Journal Article
%T Primal least squares twin support vector regression
%A Hua-juan Huang
%A Shi-fei Ding
%A Zhong-zhi Shi
%J Journal of Zhejiang University SCIENCE C
%V 14
%N 9
%P 722-732
%@ 1869-1951
%D 2013
%I Zhejiang University Press & Springer
%R 10.1631/jzus.CIIP1301
TY - JOUR
T1 - Primal least squares twin support vector regression
A1 - Hua-juan Huang
A1 - Shi-fei Ding
A1 - Zhong-zhi Shi
JO - Journal of Zhejiang University Science C
VL - 14
IS - 9
SP - 722
EP - 732
SN - 1869-1951
Y1 - 2013
PB - Zhejiang University Press & Springer
DO - 10.1631/jzus.CIIP1301
ER -
Abstract: The training of classical twin support vector regression (TSVR) reduces to solving a pair of quadratic programming problems (QPPs) with inequality constraints in the dual space. However, this solution suffers from heavy time and memory costs when dealing with large datasets. In this paper, we present a least squares version of TSVR in the primal space, termed primal least squares TSVR (PLSTSVR). By introducing the least squares method, the inequality constraints of TSVR are transformed into equality constraints. Furthermore, we solve the two QPPs with equality constraints directly in the primal space instead of the dual space; thus, we need only solve two systems of linear equations instead of two QPPs. Experimental results on artificial and benchmark datasets show that PLSTSVR achieves accuracy comparable to TSVR with considerably less computational time. We further investigate its validity in predicting stock opening prices.
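The abstract's core idea, for the linear case, can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: it assumes a ridge-style least-squares setup in which each of the two bound regressors is obtained by solving one regularized linear system against epsilon-shifted targets, and the final regressor averages the two bounds. The function names and parameter defaults are hypothetical.

```python
import numpy as np

def plstsvr_fit(X, y, eps1=0.1, eps2=0.1, c1=1.0, c2=1.0):
    """Sketch of a linear primal least squares TSVR.

    Each bound function is found by solving one regularized
    linear system (normal equations) rather than a QPP with
    inequality constraints.
    """
    n = X.shape[0]
    G = np.hstack([X, np.ones((n, 1))])   # augmented data [A, e]
    I = np.eye(G.shape[1])
    # down-bound f1: least squares fit to targets shifted down by eps1
    u1 = np.linalg.solve(G.T @ G + I / c1, G.T @ (y - eps1))
    # up-bound f2: least squares fit to targets shifted up by eps2
    u2 = np.linalg.solve(G.T @ G + I / c2, G.T @ (y + eps2))
    return u1, u2

def plstsvr_predict(X, u1, u2):
    G = np.hstack([X, np.ones((X.shape[0], 1))])
    return 0.5 * (G @ u1 + G @ u2)        # mean of the two bound functions
```

With symmetric shifts (eps1 = eps2) the two bounds bracket the data and their average recovers the underlying trend; replacing the two QPP solvers with two calls to `np.linalg.solve` is what makes the primal least squares variant fast.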