
Gegenbauer Graph Neural Networks for Time-Varying Signal Reconstruction


Abstract:

Reconstructing time-varying graph signals (or graph time-series imputation) is a critical problem in machine learning and signal processing with broad applications, ranging from missing data imputation in sensor networks to time-series forecasting. Accurately capturing the spatio-temporal information inherent in these signals is crucial for effectively addressing these tasks. However, existing approaches rely on smoothness assumptions about temporal differences and on simple convex optimization techniques, which have inherent limitations. To address these challenges, we propose a novel approach that incorporates a learning module to enhance the accuracy of the downstream task. To this end, we introduce the Gegenbauer-based graph convolutional (GegenConv) operator, a generalization of the conventional Chebyshev graph convolution that leverages the theory of Gegenbauer polynomials. By moving beyond traditional convex formulations, we increase the expressive capacity of the model and obtain a more accurate solution for recovering time-varying graph signals. Building upon GegenConv, we design the Gegenbauer-based time graph neural network (GegenGNN) architecture, which adopts an encoder–decoder structure. Our approach also uses a dedicated loss function that combines a mean squared error (MSE) component with Sobolev smoothness regularization. This combination enables GegenGNN to capture both fidelity to the ground truth and the underlying smoothness properties of the signals, enhancing reconstruction performance. We conduct extensive experiments on real datasets to evaluate the effectiveness of our proposed approach. The experimental results demonstrate that GegenGNN outperforms state-of-the-art methods, showcasing its superior capability in recovering time-varying graph signals.
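The following is a minimal, self-contained sketch (not the authors' implementation) of how a Gegenbauer-based graph convolution of order K can be realized: the graph signal is propagated through the three-term Gegenbauer recurrence evaluated at a rescaled graph Laplacian, and the resulting terms are combined with learnable coefficients. The function and variable names (gegen_conv, theta, alpha) are illustrative assumptions; for alpha = 1 the recurrence reduces to Chebyshev polynomials of the second kind, which is how the operator generalizes Chebyshev graph convolution.

import numpy as np

def rescaled_laplacian(A):
    """Symmetric normalized Laplacian rescaled to (approximately) [-1, 1]."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.zeros_like(d)
    nz = d > 0
    d_inv_sqrt[nz] = d[nz] ** -0.5
    L = np.eye(A.shape[0]) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    lam_max = np.linalg.eigvalsh(L).max()
    return 2.0 * L / lam_max - np.eye(A.shape[0])

def gegen_conv(x, A, theta, alpha=1.0):
    """Compute sum_k theta[k] * C_k^{(alpha)}(L_hat) @ x via the Gegenbauer recurrence."""
    L_hat = rescaled_laplacian(A)
    K = len(theta)
    C_prev2 = x                              # C_0^{(alpha)}(L_hat) x = x
    out = theta[0] * C_prev2
    if K > 1:
        C_prev1 = 2.0 * alpha * (L_hat @ x)  # C_1^{(alpha)}(L_hat) x = 2*alpha*L_hat x
        out = out + theta[1] * C_prev1
    for k in range(2, K):
        # C_k = [2(k + alpha - 1) L_hat C_{k-1} - (k + 2*alpha - 2) C_{k-2}] / k
        C_k = (2.0 * (k + alpha - 1.0) * (L_hat @ C_prev1)
               - (k + 2.0 * alpha - 2.0) * C_prev2) / k
        out = out + theta[k] * C_k
        C_prev2, C_prev1 = C_prev1, C_k
    return out

# Toy usage: a 4-node path graph with a time-varying signal of 8 time steps (nodes x time).
A = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], dtype=float)
x = np.random.randn(4, 8)
y = gegen_conv(x, A, theta=[0.5, 0.3, 0.2], alpha=1.5)
print(y.shape)  # (4, 8)

A full GegenGNN model would stack such layers in an encoder–decoder and train with the MSE-plus-Sobolev-smoothness loss described above; that machinery is omitted from this sketch.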
Published in: IEEE Transactions on Neural Networks and Learning Systems (Volume: 35, Issue: 9, September 2024)
Page(s): 11734 - 11745
Date of Publication: 10 April 2024

PubMed ID: 38598390

I. Introduction

The volume of complex unstructured data has surged with recent advances in information technology, and representing and analyzing such data remains a formidable challenge. Graph signal processing (GSP) and graph neural networks (GNNs) have emerged as promising research areas that show remarkable potential for unstructured data [1], [2], [3], [4]. Both model data as signals or vectors residing on the nodes of a graph, incorporating feature information as well as the inherent relational structure of the data. This framework offers novel insights into data manipulation, effectively bridging machine learning and signal processing [5], and has profound implications across diverse fields, including semi-supervised learning [3], node classification, link prediction, graph classification [6], [7], [8], [9], clustering [10], computer vision [11], [12], [13], recommendations in social networks [14], [15], influence propagation [16] and misinformation detection [17], materials modeling [18], and drug discovery [19], among others.
