CLC number: TP183
Crosschecked: 2012-07-06
Xiao-chuan Sun, Hong-yan Cui, Ren-ping Liu, Jian-ya Chen, Yun-jie Liu. Modeling deterministic echo state network with loop reservoir[J]. Journal of Zhejiang University Science C, 2012, 13(9): 689-701.
@article{Sun2012loop,
title="Modeling deterministic echo state network with loop reservoir",
author="Xiao-chuan Sun, Hong-yan Cui, Ren-ping Liu, Jian-ya Chen, Yun-jie Liu",
journal="Journal of Zhejiang University Science C",
volume="13",
number="9",
pages="689-701",
year="2012",
publisher="Zhejiang University Press & Springer",
doi="10.1631/jzus.C1200069"
}
%0 Journal Article
%T Modeling deterministic echo state network with loop reservoir
%A Xiao-chuan Sun
%A Hong-yan Cui
%A Ren-ping Liu
%A Jian-ya Chen
%A Yun-jie Liu
%J Journal of Zhejiang University SCIENCE C
%V 13
%N 9
%P 689-701
%@ 1869-1951
%D 2012
%I Zhejiang University Press & Springer
%R 10.1631/jzus.C1200069
TY - JOUR
T1 - Modeling deterministic echo state network with loop reservoir
A1 - Xiao-chuan Sun
A1 - Hong-yan Cui
A1 - Ren-ping Liu
A1 - Jian-ya Chen
A1 - Yun-jie Liu
JO - Journal of Zhejiang University Science C
VL - 13
IS - 9
SP - 689
EP - 701
SN - 1869-1951
Y1 - 2012
PB - Zhejiang University Press & Springer
DO - 10.1631/jzus.C1200069
ER -
Abstract: The echo state network (ESN) has been proposed as a special form of recurrent neural network that models nonlinear dynamic systems efficiently. However, most proposed ESNs rely on complex reservoir structures, which incur excessive computational cost. Recently, minimum complexity ESNs were proposed and shown to combine high performance with low computational cost. In this paper, we propose a simple deterministic ESN with a loop reservoir, i.e., an ESN with an adjacent-feedback loop reservoir. The novel reservoir is constructed by introducing regular adjacent feedback into the simplest loop reservoir. Only a single free parameter needs to be tuned, which considerably simplifies ESN construction. This combination of a simplified reservoir and fewer free parameters yields superior prediction performance: on benchmark datasets and real-world tasks, our scheme achieves higher prediction accuracy with relatively low complexity than both the classic ESN and the minimum complexity ESN. Furthermore, we prove that all linear ESNs with the simplest loop reservoir possess the same memory capacity, which converges arbitrarily close to the optimal value.
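To make the architecture described in the abstract concrete, below is a minimal sketch in Python/NumPy of an ESN whose reservoir is a loop with regular adjacent feedback, trained with a linear least-squares readout for one-step-ahead prediction. This is an illustration under stated assumptions, not the authors' implementation: the choice to let the loop and adjacent-feedback connections share the single tuned weight r, the fixed-magnitude input weights with a random sign pattern, and the names build_alr_reservoir and run_esn are all ours.

import numpy as np

def build_alr_reservoir(n, r):
    # Ring of n units: unit i drives unit (i+1) mod n (the simplest loop),
    # and each unit also feeds back to its predecessor (adjacent feedback).
    # Assumption: both connection types share the single tuned weight r.
    W = np.zeros((n, n))
    for i in range(n):
        W[(i + 1) % n, i] = r   # forward loop connection
        W[i, (i + 1) % n] = r   # adjacent-feedback connection
    return W

def run_esn(u, n=100, r=0.5, v=0.5, washout=100, seed=0):
    # One-step-ahead prediction of a scalar series u with a linear
    # least-squares readout. The random input-sign pattern is an
    # illustrative choice, not the paper's deterministic scheme.
    rng = np.random.default_rng(seed)
    W = build_alr_reservoir(n, r)
    w_in = v * rng.choice([-1.0, 1.0], size=n)
    x = np.zeros(n)
    states = []
    for t in range(len(u) - 1):
        x = np.tanh(W @ x + w_in * u[t])
        states.append(x.copy())
    X = np.asarray(states[washout:])    # discard transient states
    y = u[washout + 1:]                 # targets: next value of u
    w_out, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ w_out, y                 # predictions and targets

For example, calling run_esn on a sampled sine wave and reporting the normalized mean squared error:

t = np.linspace(0, 100, 2000)
u = np.sin(t)
pred, target = run_esn(u)
print("NMSE:", np.mean((pred - target) ** 2) / np.var(target))

Since the paper's scheme is fully deterministic, a deterministic sign pattern for the input weights (as in the minimum complexity ESN literature) would replace the random choice used here.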
[1]Abbasi Nozari, H., Dehghan Banadaki, H., Mokhtare, M., Hekmati Vahed, S., 2012. Intelligent non-linear modelling of an industrial winding process using recurrent local linear neuro-fuzzy networks. J. Zhejiang Univ.-Sci. C (Comput. & Electron.), 13(6):403-412.
[2]Atiya, A.F., Parlos, A.G., 2000. New results on recurrent network training: unifying the algorithms and accelerating convergence. IEEE Trans. Neur. Networks, 11(3):697-709.
[3]Chatzis, S.P., Demiris, Y., 2011. Echo state Gaussian process. IEEE Trans. Neur. Networks, 22(9):1435-1445.
[4]Deng, Z.D., Zhang, Y., 2007. Collective behavior of a small-world recurrent neural system with scale-free distribution. IEEE Trans. Neur. Networks, 18(5):1364-1375.
[5]Hénon, M., 1976. A two-dimensional mapping with a strange attractor. Commun. Math. Phys., 50(1):69-77.
[6]Hochreiter, S., Bengio, Y., Frasconi, P., Schmidhuber, J., 2001. Gradient Flow in Recurrent Nets: the Difficulty of Learning Long-Term Dependencies. In: Kolen, J., Kremer, S. (Eds.), A Field Guide to Dynamical Recurrent Networks. Wiley-IEEE Press, New York, p.237-243.
[7]Holzmann, G., Hauser, H., 2010. Echo state networks with filter neurons and a delay&sum readout. Neur. Networks, 23(2):244-256.
[8]Ikeda, K., Daido, H., Akimoto, O., 1980. Optical turbulence: chaotic behavior of transmitted light from a ring cavity. Phys. Rev. Lett., 45(9):709-712.
[9]Jaeger, H., 2001. The ‘Echo State’ Approach to Analysing and Training Recurrent Neural Networks. Technical Report No. 148, German National Research Center for Information Technology, Sankt Augustin, Germany.
[10]Jaeger, H., 2002a. A Tutorial on Training Recurrent Neural Networks, Covering BPTT, RTRL, EKF and the ‘Echo State Network’ Approach. Technical Report No. GMD 159, German National Research Center for Information Technology, Sankt Augustin, Germany.
[11]Jaeger, H., 2002b. Short Term Memory in Echo State Networks. Technical Report No. GMD 152, German National Research Center for Information Technology, Sankt Augustin, Germany.
[12]Jaeger, H., 2002c. Adaptive Nonlinear System Identification with Echo State Networks. Advances in Neural Information Processing Systems, 15:593-600.
[13]Jaeger, H., Haas, H., 2004. Harnessing nonlinearity: predicting chaotic systems and saving energy in wireless communication. Science, 304(5667):78-80.
[14]Lukosevicius, M., Jaeger, H., 2009. Reservoir computing approaches to recurrent neural network training. Comput. Sci. Rev., 3(3):127-149.
[15]Mandic, D.P., Chambers, J.A., 2001. Recurrent Neural Networks for Prediction: Learning Algorithms, Architectures and Stability. Wiley, New York, p.31-47.
[16]National Geophysical Data Center, 2007. Sunspot Numbers. Available from http://www.ngdc.noaa.gov/stp/iono/sunspot.html [Accessed on Sept. 23, 2011].
[17]Ozturk, M.C., Principe, J.C., 2007. An associative memory readout for ESNs with applications to dynamical pattern recognition. Neur. Networks, 20(3):377-390.
[18]Rodan, A., Tino, P., 2011. Minimum complexity echo state network. IEEE Trans. Neur. Networks, 22(1):131-144.
[19]Salmen, M., Ploger, P.G., 2005. Echo State Networks Used for Motor Control. Proc. IEEE Int. Conf. on Robotics and Automation, p.1953-1958.
[20]Schwenker, F., Labib, A., 2009. Echo State Networks and Neural Network Ensembles to Predict Sunspots Activity. 17th European Symp. on Artificial Neural Networks, p.379-384.
[21]Shi, Z.W., Han, M., 2007. Support vector echo-state machine for chaotic time-series prediction. IEEE Trans. Neur. Networks, 18(2):359-372.
[22]Siegelmann, H.T., Sontag, E.D., 1991. Turing computability with neural nets. Appl. Math. Lett., 4(6):77-80.
[23]Steil, J.J., 2005. Memory in backpropagation-decorrelation O(N) efficient online recurrent learning. LNCS, 3697:649-654.
[24]Tino, P., Schittenkopf, C., Dorffner, G., 2001. Financial volatility trading using recurrent neural networks. IEEE Trans. Neur. Networks, 12(4):865-874.
[25]Wyffels, F., Schrauwen, B., Stroobandt, D., 2008. Stable output feedback in reservoir computing using ridge regression. LNCS, 5163:808-817.
[26]Xia, Y.L., Jelfs, B., van Hulle, M.M., Principe, J.C., Mandic, D.P., 2011. An augmented echo state network for nonlinear adaptive filtering of complex noncircular signals. IEEE Trans. Neur. Networks, 22(1):74-83.
[27]Xue, Y.B., Yang, L., Haykin, S., 2007. Decoupled echo state networks with lateral inhibition. Neur. Networks, 20(3):365-376.
[28]Zhang, B., Miller, D.J., Wang, Y., 2012. Nonlinear system modeling with random matrices: echo state networks revisited. IEEE Trans. Neur. Networks Learn. Syst., 23(1):175-182.