[1] SALINAS D, FLUNKERT V, GASTHAUS J, et al. DeepAR: Probabilistic forecasting with autoregressive recurrent networks[J]. International Journal of Forecasting, 2020, 36(3): 1181-1191.
[2] TRIEBE O, HEWAMALAGE H, PILYUGINA P, et al. NeuralProphet: Explainable Forecasting at Scale[A]. 2021. arXiv: 2111.15397.
[3] LIM B, ARIK S Ö, LOEFF N, et al. Temporal fusion transformers for interpretable multi-horizon time series forecasting[J]. International Journal of Forecasting, 2021, 37(4): 1748-1764.
[4] XIAO H, SUN H, RAN B, et al. Fuzzy-neural network traffic prediction framework with wavelet decomposition[J]. Transportation Research Record, 2003, 1836(1): 16-20.
[5] JEONG Y S, BYON Y J, CASTRO-NETO M M, et al. Supervised weighting-online learning algorithm for short-term traffic flow prediction[J]. IEEE Transactions on Intelligent Transportation Systems, 2013, 14(4): 1700-1707.
[6] SUN Y, LENG B, GUAN W. A novel wavelet-SVM short-time passenger flow prediction in Beijing subway system[J]. Neurocomputing, 2015, 166: 109-121.
[7] BILLAH B, KING M L, SNYDER R D, et al. Exponential smoothing model selection for forecasting[J]. International Journal of Forecasting, 2006, 22(2): 239-247.
[8] ZHANG G P. Time series forecasting using a hybrid ARIMA and neural network model[J]. Neurocomputing, 2003, 50: 159-175.
[9] ZIVOT E, WANG J. Vector autoregressive models for multivariate time series[J]. Modeling Financial Time Series with S-PLUS®, 2006: 385-429.
[10] LECUN Y, BENGIO Y, HINTON G. Deep learning[J]. Nature, 2015, 521(7553): 436-444.
[11] BAI S, KOLTER J Z, KOLTUN V. An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling[A]. 2018. arXiv: 1803.01271.
[12] HOCHREITER S, SCHMIDHUBER J. Long short-term memory[J]. Neural Computation, 1997, 9(8): 1735-1780.
[13] LAI G, CHANG W C, YANG Y, et al. Modeling long- and short-term temporal patterns with deep neural networks[C]//The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval. 2018: 95-104.
[14] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[J]. Advances in Neural Information Processing Systems, 2017, 30.
[15] LI S, JIN X, XUAN Y, et al. Enhancing the locality and breaking the memory bottleneck of transformer on time series forecasting[J]. Advances in Neural Information Processing Systems, 2019, 32.
[16] ZHOU H, ZHANG S, PENG J, et al. Informer: Beyond efficient transformer for long sequence time-series forecasting[C]//Proceedings of the AAAI Conference on Artificial Intelligence: volume 35. 2021: 11106-11115.
[17] SUTSKEVER I, VINYALS O, LE Q V. Sequence to sequence learning with neural networks[J]. Advances in Neural Information Processing Systems, 2014, 27.
[18] WU H, XU J, WANG J, et al. Autoformer: Decomposition transformers with auto-correlation for long-term series forecasting[J]. Advances in Neural Information Processing Systems, 2021, 34: 22419-22430.
[19] JIANG W, LUO J. Graph neural network for traffic forecasting: A survey[J]. Expert Systems with Applications, 2022: 117921.
[20] DIAO Z, WANG X, ZHANG D, et al. Dynamic spatial-temporal graph convolutional neural networks for traffic forecasting[C]//Proceedings of the AAAI Conference on Artificial Intelligence: volume 33. 2019: 890-897.
[21] ZHANG Q, CHANG J, MENG G, et al. Spatio-temporal graph structure learning for traffic forecasting[C]//Proceedings of the AAAI Conference on Artificial Intelligence: volume 34. 2020: 1177-1185.
[22] WU Z, PAN S, LONG G, et al. Connecting the dots: Multivariate time series forecasting with graph neural networks[C]//Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. 2020: 753-763.
[23] CHEN W, CHEN L, XIE Y, et al. Multi-range attentive bicomponent graph convolutional network for traffic forecasting[C]//Proceedings of the AAAI Conference on Artificial Intelligence: volume 34. 2020: 3529-3536.
[24] RAO X, WANG H, ZHANG L, et al. FOGS: First-order gradient supervision with learning-based graph for traffic flow forecasting[C]//Proceedings of the International Joint Conference on Artificial Intelligence, IJCAI. 2022.
[25] LAN S, MA Y, HUANG W, et al. DSTAGNN: Dynamic spatial-temporal aware graph neural network for traffic flow forecasting[C]//International Conference on Machine Learning. PMLR, 2022: 11906-11917.
[26] YU B, YIN H, ZHU Z. Spatio-temporal graph convolutional networks: A deep learning framework for traffic forecasting[C]//Proceedings of the 27th International Joint Conference on Artificial Intelligence. 2018: 3634-3640.
[27] LI Y, YU R, SHAHABI C, et al. Diffusion Convolutional Recurrent Neural Network: Data-Driven Traffic Forecasting[C]//International Conference on Learning Representations. 2018.
[28] WU Z, PAN S, LONG G, et al. Graph WaveNet for deep spatial-temporal graph modeling[C]//Proceedings of the 28th International Joint Conference on Artificial Intelligence. 2019: 1907-1913.
[29] BAI L, YAO L, LI C, et al. Adaptive graph convolutional recurrent network for traffic forecasting[J]. Advances in Neural Information Processing Systems, 2020, 33: 17804-17815.
[30] LIU S, YU H, LIAO C, et al. Pyraformer: Low-complexity pyramidal attention for long-range time series modeling and forecasting[C]//International Conference on Learning Representations. 2021.
[31] ZHOU T, MA Z, WEN Q, et al. FEDformer: Frequency enhanced decomposed transformer for long-term series forecasting[C]//International Conference on Machine Learning. PMLR, 2022: 27268-27286.
[32] WEN Q, SUN L, SONG X, et al. Time Series Data Augmentation for Deep Learning: A Survey [C]//International Joint Conference on Artificial Intelligence. 2020.
[33] POPEL M, BOJAR O. Training Tips for the Transformer Model[J]. The Prague Bulletin of Mathematical Linguistics, 2018, 110: 43-70.
[34] CHUNG J, GULCEHRE C, CHO K, et al. Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling[A]. 2014. arXiv: 1412.3555.
[35] JADERBERG M, SIMONYAN K, ZISSERMAN A, et al. Spatial transformer networks[J]. Advances in Neural Information Processing Systems, 2015, 28.
[36] BAHDANAU D, CHO K H, BENGIO Y. Neural machine translation by jointly learning to align and translate[C]//3rd International Conference on Learning Representations, ICLR 2015. 2015.
[37] DEVLIN J, CHANG M W, LEE K, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding[C]//Proceedings of NAACL-HLT. 2019: 4171-4186.
[38] DOSOVITSKIY A, BEYER L, KOLESNIKOV A, et al. An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale[C]//International Conference on Learning Representations. 2021.
[39] ARNAB A, DEHGHANI M, HEIGOLD G, et al. ViViT: A video vision transformer[C]//Proceedings of the IEEE/CVF International Conference on Computer Vision. 2021: 6836-6846.
[40] GULATI A, QIN J, CHIU C C, et al. Conformer: Convolution-augmented Transformer for Speech Recognition[A]. 2020. arXiv: 2005.08100.
[41] LAMB A M, GOYAL A, ZHANG Y, et al. Professor forcing: A new algorithm for training recurrent networks[J]. Advances in Neural Information Processing Systems, 2016, 29.
[42] KIPF T N, WELLING M. Semi-Supervised Classification with Graph Convolutional Networks[C]//International Conference on Learning Representations. 2017.
[43] GILMER J, SCHOENHOLZ S S, RILEY P F, et al. Neural message passing for quantum chemistry[C]//International Conference on Machine Learning. PMLR, 2017: 1263-1272.
[44] DEFFERRARD M, BRESSON X, VANDERGHEYNST P. Convolutional neural networks on graphs with fast localized spectral filtering[J]. Advances in Neural Information Processing Systems, 2016, 29.
[45] VELIČKOVIĆ P, CUCURULL G, CASANOVA A, et al. Graph Attention Networks[C]//International Conference on Learning Representations. 2018.
[46] ZHU Y, XU W, ZHANG J, et al. A Survey on Graph Structure Learning: Progress and Opportunities[A]. 2022. arXiv: 2103.03036.
[47] LI R, WANG S, ZHU F, et al. Adaptive graph convolutional neural networks[C]//Proceedings of the AAAI Conference on Artificial Intelligence: volume 32. 2018.
[48] ZHU Y, XU Y, YU F, et al. CAGNN: Cluster-Aware Graph Neural Networks for Unsupervised Graph Representation Learning[A]. 2020. arXiv: 2009.01674.
[49] CHEN Y, WU L, ZAKI M. Iterative deep graph learning for graph neural networks: Better and robust node embeddings[J]. Advances in Neural Information Processing Systems, 2020, 33: 19314-19326.
[50] ZHANG X, ZENG P, ZHANG R, et al. Characteristics and value of traffic big data[J]. 软件导刊 (Software Guide), 2016, 15(3): 3.
[51] LI D. On the intelligent processing and services of spatio-temporal big data[J]. 地球信息科学学报 (Journal of Geo-information Science), 2019, 21(12): 7.
[52] KIM T, KIM J, TAE Y, et al. Reversible instance normalization for accurate time-series forecasting against distribution shift[C]//International Conference on Learning Representations. 2021.
[53] JANG E, GU S, POOLE B. Categorical Reparameterization with Gumbel-Softmax[C]//International Conference on Learning Representations. 2017.
[54] ABU-EL-HAIJA S, PEROZZI B, KAPOOR A, et al. MixHop: Higher-order graph convolutional architectures via sparsified neighborhood mixing[C]//International Conference on Machine Learning. PMLR, 2019: 21-29.
[55] KINGMA D P, BA J. Adam: A Method for Stochastic Optimization[C]//BENGIO Y, LECUN Y. 3rd International Conference on Learning Representations, ICLR 2015, San Diego, CA, USA, May 7-9, 2015, Conference Track Proceedings. 2015.
[56] GUO S, LIN Y, FENG N, et al. Attention based spatial-temporal graph convolutional networks for traffic flow forecasting[C]//Proceedings of the AAAI Conference on Artificial Intelligence: volume 33. 2019: 922-929.
[57] SONG C, LIN Y, GUO S, et al. Spatial-temporal synchronous graph convolutional networks: A new framework for spatial-temporal network data forecasting[C]//Proceedings of the AAAI Conference on Artificial Intelligence: volume 34. 2020: 914-921.
[58] LI M, ZHU Z. Spatial-temporal fusion graph neural networks for traffic flow forecasting[C]//Proceedings of the AAAI Conference on Artificial Intelligence: volume 35. 2021: 4189-4196.
[59] WANG D, CUI P, ZHU W. Structural deep network embedding[C]//Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. 2016: 1225-1234.