Abstract:
Multivariate time series are prevalent in many scientific and industrial domains. Modeling multivariate signals is challenging due to their long-range temporal dependencies and intricate interactions, both direct and indirect. To confront these complexities, we introduce a method that represents multivariate signals as nodes in a graph, with edges indicating the interdependencies between them. Specifically, we leverage graph neural networks (GNNs) and attention mechanisms to efficiently learn the underlying relationships within the time series data. Moreover, we employ hierarchical signal decompositions running over the graphs to capture multiple spatial dependencies. The effectiveness of the proposed model is evaluated across several real-world benchmark datasets designed for long-term forecasting tasks. The results consistently show the superiority of our model, which achieves an average 23% reduction in mean squared error (MSE) compared to existing models.
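As a rough illustration of the pipeline the abstract describes (a graph learned over the variables, followed by GNN-style message passing and a forecasting head), the PyTorch sketch below computes a soft adjacency matrix from attention scores between per-variable embeddings and runs one message-passing step. The class name, layer sizes, and single-step readout are illustrative assumptions, not the authors' architecture.

import torch
import torch.nn as nn
import torch.nn.functional as F

class LatentGraphForecaster(nn.Module):
    """Toy sketch: learn a dense soft adjacency from node embeddings via
    scaled dot-product attention, run one graph message-passing step, and
    apply a linear readout that forecasts the next `horizon` steps per variable."""

    def __init__(self, num_nodes, window, horizon, dim=32):
        super().__init__()
        self.encode = nn.Linear(window, dim)     # per-variable temporal encoder
        self.query = nn.Linear(dim, dim)
        self.key = nn.Linear(dim, dim)
        self.message = nn.Linear(dim, dim)       # message transform for the GNN step
        self.readout = nn.Linear(dim, horizon)   # per-variable forecast head

    def forward(self, x):                         # x: (batch, num_nodes, window)
        h = torch.relu(self.encode(x))            # node embeddings
        q, k = self.query(h), self.key(h)
        scores = q @ k.transpose(-1, -2) / h.shape[-1] ** 0.5
        adj = F.softmax(scores, dim=-1)           # learned row-stochastic adjacency
        h = torch.relu(adj @ self.message(h) + h) # one round of message passing
        return self.readout(h)                    # (batch, num_nodes, horizon)

# usage: 8 series, 96-step lookback, 24-step forecast
model = LatentGraphForecaster(num_nodes=8, window=96, horizon=24)
y_hat = model(torch.randn(4, 8, 96))
print(y_hat.shape)  # torch.Size([4, 8, 24])

A hierarchical variant, in the spirit of the decompositions the abstract mentions, would repeat the encode/message-pass step over progressively coarser decompositions of the input window rather than over the raw window alone.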
Figure: Overview of the latent graph structure learning (L-GSL). (a) Key nodes chosen at random (depicted as gray circles) are used to measure the significance of a query node (s...
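The caption sketches the L-GSL idea of scoring each query node against a small set of randomly chosen key nodes rather than against every other node. A minimal, hypothetical version of that sampling step is shown below; the function name, embedding shapes, and softmax normalization are assumptions, since the full caption and method details are not reproduced here.

import torch
import torch.nn.functional as F

def sampled_attention_affinity(h, num_keys=4, generator=None):
    """Score every query node only against a random subset of key nodes,
    yielding a (num_nodes x num_keys) affinity instead of the full
    (num_nodes x num_nodes) attention map.

    h: (num_nodes, dim) node embeddings.
    Returns (affinity, key_idx): row-normalized scores and the sampled key indices.
    """
    num_nodes, dim = h.shape
    key_idx = torch.randperm(num_nodes, generator=generator)[:num_keys]  # the "gray circle" nodes
    keys = h[key_idx]                                  # (num_keys, dim)
    scores = h @ keys.T / dim ** 0.5                   # (num_nodes, num_keys)
    return F.softmax(scores, dim=-1), key_idx

affinity, key_idx = sampled_attention_affinity(torch.randn(8, 32), num_keys=4)
print(affinity.shape, key_idx.tolist())  # torch.Size([8, 4]) and the 4 sampled node ids

With num_keys much smaller than num_nodes, building the affinity map costs linear rather than quadratic time in the number of series, which is presumably the motivation for sampling key nodes.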
Published in: IEEE Access (Volume: 11)