
Higher-Order Sparse Convolutions in Graph Neural Networks



Abstract:

Graph Neural Networks (GNNs) have been applied to many problems in computer science. Capturing higher-order relationships between nodes is crucial to increase the expressive power of GNNs. However, existing methods to capture these relationships can be infeasible for large-scale graphs. In this work, we introduce a new higher-order sparse convolution based on the Sobolev norm of graph signals. Our Sparse Sobolev GNN (S-SobGNN) computes a cascade of filters on each layer with increasing Hadamard powers to obtain a more diverse set of functions, and a linear combination layer then weights the embeddings of each filter. We evaluate S-SobGNN in several applications of semi-supervised learning. S-SobGNN shows competitive performance in all applications compared to several state-of-the-art methods.
Date of Conference: 04-10 June 2023
Date Added to IEEE Xplore: 05 May 2023
Conference Location: Rhodes Island, Greece
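
The abstract above describes each S-SobGNN layer as a cascade of filters built from increasing Hadamard (element-wise) powers of a shifted graph operator, followed by a learned linear combination of the filter embeddings. The snippet below is a minimal PyTorch sketch of that idea, not the authors' implementation: the class name SparseSobolevLayer, the epsilon shift, the dense matrix representation, and the softmax-normalized combination weights are illustrative assumptions.

```python
import torch
import torch.nn as nn


class SparseSobolevLayer(nn.Module):
    """Hypothetical sketch of one S-SobGNN-style layer: a cascade of filters
    based on increasing Hadamard powers of a shifted Laplacian, followed by a
    learned linear combination of the branch embeddings."""

    def __init__(self, in_dim, out_dim, alpha_max=3, epsilon=1.0):
        super().__init__()
        self.alpha_max = alpha_max
        self.epsilon = epsilon
        # One weight matrix per Hadamard power (one filter branch each).
        self.branches = nn.ModuleList(
            [nn.Linear(in_dim, out_dim, bias=False) for _ in range(alpha_max)]
        )
        # Learnable coefficients for combining the branch outputs.
        self.combine = nn.Parameter(torch.ones(alpha_max) / alpha_max)

    def forward(self, x, laplacian):
        # laplacian: [N, N] graph Laplacian, kept dense here for clarity.
        # The point of the Hadamard power in the paper is that it preserves
        # the sparsity pattern, unlike ordinary matrix powers.
        n = laplacian.size(0)
        shifted = laplacian + self.epsilon * torch.eye(n, device=x.device)
        outputs = []
        hadamard = shifted
        for alpha in range(self.alpha_max):
            # Branch alpha propagates features with the (alpha+1)-th Hadamard power.
            outputs.append(self.branches[alpha](hadamard @ x))
            hadamard = hadamard * shifted  # element-wise (Hadamard) product
        weights = torch.softmax(self.combine, dim=0)
        return sum(w * h for w, h in zip(weights, outputs))
```

As a usage sketch, `SparseSobolevLayer(16, 32, alpha_max=3)` applied to node features `x` of shape `[N, 16]` and an `[N, N]` Laplacian returns `[N, 32]` embeddings; stacking such layers with nonlinearities between them would follow the usual GNN pattern.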

1. INTRODUCTION

Graph representation learning and its applications have gained significant attention in recent years. Notably, Graph Neural Networks (GNNs) have been extensively studied [1]–[6]. GNNs extend the concepts of Convolutional Neural Networks (CNNs) [7] to non-Euclidean data modeled as graphs. GNNs have numerous applications such as semi-supervised learning [2], graph clustering [8], point cloud semantic segmentation [9], misinformation detection [10], and protein modeling [11]. Similarly, other graph learning techniques have recently been applied to image and video processing [12], [13].
