[1] GONON L, ORTEGA J P. Reservoir computing universality with stochastic inputs[J]. IEEE Transactions on Neural Networks and Learning Systems, 2019, 31(1): 100-112.
[2] HART A G, HOOK J L, DAWES J H. Echo State Networks trained by Tikhonov least squares are L2(μ) approximators of ergodic dynamical systems[J/OL]. Physica D: Nonlinear Phenomena, 2021, 421: 132882. https://www.sciencedirect.com/science/article/pii/S0167278921000403. DOI: https://doi.org/10.1016/j.physd.2021.132882.
[3] BOLLT E. On explaining the surprising success of reservoir computing forecaster of chaos? The universal machine learning dynamical system with contrast to VAR and DMD[J]. Chaos: An Interdisciplinary Journal of Nonlinear Science, 2021, 31(1): 013108.
[4] GAUTHIER D J, BOLLT E, GRIFFITH A, et al. Next generation reservoir computing[J]. Nature Communications, 2021, 12(1): 5564.
[5] YUAN J D, WANG Z H. Review of time series representation and classification techniques[J]. Computer Science, 2015, 42(3): 1-7.
[6] HE Y L, XU Q K. Review of time series prediction technology[J]. Information and Communications, 2018, 11.
[7] YULE G U. VII. On a method of investigating periodicities in disturbed series, with special reference to Wolfer's sunspot numbers[J]. Philosophical Transactions of the Royal Society of London. Series A, Containing Papers of a Mathematical or Physical Character, 1927, 226(636-646): 267-298.
[8] BOX G E, JENKINS G M, REINSEL G C, et al. Time series analysis: forecasting and control[M]. John Wiley & Sons, 2015.
[9] LEE N, CHOI H, KIM S H. Bayes shrinkage estimation for high-dimensional VAR models with scale mixture of normal distributions for noise[J]. Computational Statistics & Data Analysis, 2016, 101: 250-276.
[10] ATHANASOPOULOS G, POSKITT D S, VAHID F, et al. Determination of long-run and short-run dynamics in EC-VARMA models via canonical correlations[J]. Journal of Applied Econometrics, 2016, 31(6): 1100-1119.
[11] BERARDENGO M, ROSSI G B, CRENNA F. Sea spectral estimation using ARMA models[J]. Sensors, 2021, 21(13): 4280.
[12] RAO T S, GABR M M. An introduction to bispectral analysis and bilinear time series models: volume 24[M]. Springer Science & Business Media, 2012.
[13] XIANG Y. Using ARIMA-GARCH model to analyze fluctuation law of international oil price[J]. Mathematical Problems in Engineering, 2022, 2022: 1-7.
[14] VAPNIK V. The nature of statistical learning theory[M]. Springer Science & Business Media, 1999.
[15] SUN W, XU C. Carbon price prediction based on modified wavelet least square support vector machine[J]. Science of the Total Environment, 2021, 754: 142052.
[16] DING M, ZHOU H, XIE H, et al. A time series model based on hybrid-kernel least-squares support vector machine for short-term wind power forecasting[J]. ISA Transactions, 2021, 108: 58-68.
[17] LIANG F. Bayesian neural networks for nonlinear time series forecasting[J]. Statistics and Computing, 2005, 15: 13-29.
[18] CHEN X, SUN L. Bayesian temporal factorization for multidimensional time series prediction[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021, 44(9): 4659-4673.
[19] LAPEDES A, FARBER R. How neural nets work[C]//Neural Information Processing Systems. 1987.
[20] HUANG G B, ZHU Q Y, SIEW C K. Extreme learning machine: theory and applications[J]. Neurocomputing, 2006, 70(1-3): 489-501.
[21] MIRIKITANI D T, NIKOLAEV N. Recursive Bayesian recurrent neural networks for time-series modeling[J]. IEEE Transactions on Neural Networks, 2009, 21(2): 262-274.
[22] ARDALANI-FARSA M, ZOLFAGHARI S. Chaotic time series prediction with residual analysis method using hybrid Elman-NARX neural networks[J]. Neurocomputing, 2010, 73(13-15): 2540-2553.
[23] JAEGER H, HAAS H. Harnessing nonlinearity: Predicting chaotic systems and saving energy in wireless communication[J]. Science, 2004, 304(5667): 78-80.
[24] JAEGER H. The "echo state" approach to analysing and training recurrent neural networks with an erratum note[J]. Bonn, Germany: German National Research Center for Information Technology GMD Technical Report, 2001, 148(34): 13.
[25] MAASS W, NATSCHLÄGER T, MARKRAM H. Real-time computing without stable states: A new framework for neural computation based on perturbations[J]. Neural Computation, 2002, 14(11): 2531-2560.
[26] SCHRAUWEN B, VERSTRAETEN D, VAN CAMPENHOUT J. An overview of reservoir computing: theory, applications and implementations[C]//The European Symposium on Artificial Neural Networks. 2007: 471-482.
[27] GALLICCHIO C, MICHELI A, PEDRELLI L. Deep reservoir computing: A critical experimental analysis[J]. Neurocomputing, 2017, 268: 87-99.
[28] GALLICCHIO C, MICHELI A, PEDRELLI L. Design of deep echo state networks[J]. Neural Networks, 2018, 108: 33-47.
[29] AKIYAMA T, TANAKA G. Computational efficiency of multi-step learning echo state networks for nonlinear time series prediction[J]. IEEE Access, 2022, 10: 28535-28544.
[30] GOUDARZI A, STEFANOVIC D. Towards a calculus of echo state networks[J]. Procedia Computer Science, 2014, 41: 176-181.
[31] LIU W J, BAI Y T, JIN X B, et al. Adaptive broad echo state network for nonstationary time series forecasting[J]. Mathematics, 2022, 10(17): 3188.
[32] CHEN M, SAAD W, YIN C. Liquid state machine learning for resource and cache management in LTE-U unmanned aerial vehicle (UAV) networks[J]. IEEE Transactions on Wireless Communications, 2019, 18(3): 1504-1517.
[33] TANG C, JI J, LIN Q, et al. Evolutionary neural architecture design of liquid state machine for image classification[C]//ICASSP 2022-2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2022: 91-95.
[34] DECKERS L, TSANG I J, LEEKWIJCK W V, et al. Extended liquid state machines for speech recognition[J]. Frontiers in Neuroscience, 2022, 16: 1-14.
[35] SHI Z, HAN M. Support vector echo-state machine for chaotic time-series prediction[J]. IEEE Transactions on Neural Networks, 2007, 18(2): 359-372.
[36] SCHRAUWEN B, STROOBANDT D, et al. Using reservoir computing in a decomposition approach for time series prediction[C]//ESTSP 2008 European Symposium on Time Series Prediction. Multiprint Oy/Otamedia, 2008: 149-158.
[37] HAN M, WANG Y N. Prediction of multivariate time series based on reservoir principal component analysis[J]. Control and Decision, 2009, 24(10): 1526-1530.
[38] XU M L, HAN M, LIN H F. Wavelet-denoising multiple echo state networks for multivariate time series prediction[J]. Information Sciences, 2018, 465: 439-458.
[39] FERREIRA A A, LUDERMIR T B. Evolutionary strategy for simultaneous optimization of parameters, topology and reservoir weights in echo state networks[C]//The 2010 International Joint Conference on Neural Networks (IJCNN). IEEE, 2010: 1-7.
[40] RABIN M J A, HOSSAIN M S, AHSAN M S, et al. Sensitivity learning oriented nonmonotonic multi reservoir echo state network for short-term load forecasting[C]//2013 International Conference on Informatics, Electronics and Vision (ICIEV). IEEE, 2013: 1-6.
[41] CHOUIKHI N, AMMAR B, ROKBANI N, et al. PSO-based analysis of echo state network parameters for time series forecasting[J]. Applied Soft Computing, 2017, 55: 211-225.
[42] THIEDE L A, PARLITZ U. Gradient based hyperparameter optimization in echo state networks[J]. Neural Networks, 2019, 115: 23-29.
[43] ÖZTÜRK M M, CANKAYA I A, IPEKCI D. Optimizing echo state network through a novel Fisher maximization based stochastic gradient descent[J]. Neurocomputing, 2020, 415: 215-224.
[44] JAEGER H. Discovering multiscale dynamical features with hierarchical echo state networks[M]. Deutsche Nationalbibliothek, 2007.
[45] RODAN A, TINO P. Minimum complexity echo state network[J]. IEEE Transactions on Neural Networks, 2011, 22(1): 131-144.
[46] CUI H, FENG C, CHAI Y, et al. Effect of hybrid circle reservoir injected with wavelet-neurons on performance of echo state network[J]. Neural Networks, 2014, 57: 141-151.
[47] GALLICCHIO C, MICHELI A. Architectural and Markovian factors of echo state networks[J]. Neural Networks, 2011, 24(5): 440-456.
[48] MA Q L, CHEN W B. Modular state space of echo state network[J]. Neurocomputing, 2013, 122: 406-417.
[49] LUN S X, LIN J, YAO X S. Time series prediction with an improved echo state network using small world network[J]. Acta Automatica Sinica, 2015, 41(9): 1669-1679.
[50] XUE Y, ZHANG Q, SLOWIK A. Automatic topology optimization of echo state network based on particle swarm optimization[J]. Engineering Applications of Artificial Intelligence, 2023, 117: 105574.
[51] DUTOIT X, SCHRAUWEN B, VAN CAMPENHOUT J, et al. Pruning and regularization in reservoir computing[J]. Neurocomputing, 2009, 72(7-9): 1534-1546.
[52] HAN M, WANG Y N. Multivariate time series online predictor with Kalman filter trained reservoir[J]. Acta Automatica Sinica, 2010, 36(1): 169-173.
[53] SONG Q S, FENG Z R, LI R H. Stable training method for output connection weights of echo state networks[J]. Control and Decision, 2011, 26(1): 22-26.
[54] CHATZIS S P, DEMIRIS Y. Echo state Gaussian process[J]. IEEE Transactions on Neural Networks, 2011, 22(9): 1435-1445.
[55] SHAHI S, FENTON F H, CHERRY E M. Prediction of chaotic time series using recurrent neural networks and reservoir computing techniques: A comparative study[J]. Machine Learning with Applications, 2022, 8: 100300.
[56] MACKEY M C, GLASS L. Oscillation and chaos in physiological control systems[J]. Science, 1977, 197(4300): 287-289.
[57] LORENZ E N. Deterministic nonperiodic flow[J]. Journal of the Atmospheric Sciences, 1963, 20(2): 130-141.
[58] LUKOŠEVIČIUS M. A practical guide to applying echo state networks[M]//Neural Networks: Tricks of the Trade: Second Edition. Springer, 2012: 659-686.
[59] VERSTRAETEN D, DAMBRE J, DUTOIT X, et al. Memory versus non-linearity in reservoirs[C]//The 2010 International Joint Conference on Neural Networks (IJCNN). IEEE, 2010: 1-8.