Analyzing Heterogeneous Networks With Missing Attributes by Unsupervised Contrastive Learning | IEEE Journals & Magazine | IEEE Xplore

Analyzing Heterogeneous Networks With Missing Attributes by Unsupervised Contrastive Learning


Abstract:

Heterogeneous information networks (HINs) are potent models of complex systems. In practice, many nodes in an HIN have their attributes unspecified, resulting in significant performance degradation for supervised and unsupervised representation learning. We developed an unsupervised heterogeneous graph contrastive learning approach for analyzing HINs with missing attributes (HGCA). HGCA adopts a contrastive learning strategy to unify attribute completion and representation learning in an unsupervised heterogeneous framework. To deal with a large number of missing attributes and the absence of labels in unsupervised scenarios, we proposed an augmented network that captures the semantic relations between nodes and attributes to achieve fine-grained attribute completion. Extensive experiments on three large real-world HINs demonstrated the superiority of HGCA over several state-of-the-art methods. The results also showed that the attributes completed by HGCA can improve the performance of existing HIN models.
Page(s): 4438 - 4450
Date of Publication: 02 March 2022

PubMed ID: 35235523


I. Introduction

Many real-world systems, e.g., transportation networks, power grids, and social networks, are best viewed and formulated as networks. The overarching problem of mining and analyzing valuable information in networks has been actively studied for decades [1], [2]. As a more capable representation scheme using multiple types of nodes and edges, heterogeneous information networks (HINs) were recently introduced to model complex systems with various types of entities and relations [3]. Low-dimensional embedding techniques have also been adopted to derive compact representations of HINs and extract network-specific information, such as heterogeneous network structural properties and node semantic relations [3]. Several methods have been developed for heterogeneous network embedding (HNE), including proximity-preserving, message-passing, and relation-learning methods [3]. Among these, methods based on heterogeneous graph neural networks (HGNNs) are particularly popular and have been applied to, e.g., node classification [4], [5] and link prediction [6], [7].
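To make the multi-typed structure concrete, the following toy sketch builds a tiny HIN with author, paper, and venue nodes and looks up metapath-guided neighbors (e.g., the Author-Paper-Author metapath common in HIN analysis). The node names, relation names, and helper function are illustrative assumptions, not taken from the paper's datasets or code.

```python
from collections import defaultdict

# A toy heterogeneous information network: each edge is (source, relation, target).
# Node and relation names are illustrative, not from the paper's datasets.
edges = [
    ("a1", "writes", "p1"), ("a2", "writes", "p1"),
    ("a2", "writes", "p2"), ("a3", "writes", "p2"),
    ("p1", "published_in", "v1"), ("p2", "published_in", "v1"),
]

# Adjacency indexed by (source node, relation); inverse relations enable
# traversal in both directions along a metapath.
adj = defaultdict(set)
for src, rel, dst in edges:
    adj[(src, rel)].add(dst)
    adj[(dst, rel + "_inv")].add(src)

def metapath_neighbors(start, relations):
    """Nodes reachable from `start` by following the relation sequence,
    e.g. Author -writes-> Paper -writes_inv-> Author (the 'APA' metapath)."""
    frontier = {start}
    for rel in relations:
        frontier = {nbr for node in frontier for nbr in adj[(node, rel)]}
    return frontier

# Co-authors of a2 via the Author-Paper-Author metapath
coauthors = metapath_neighbors("a2", ["writes", "writes_inv"]) - {"a2"}
```

Metapath-based neighborhoods of this kind underlie many of the HNE methods cited above, such as PathSim [16] and metapath-guided embedding [17].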

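The contrastive strategy that HGCA and related methods build on typically optimizes an InfoNCE-style objective that pulls two views of the same node together while pushing apart views of different nodes. A minimal NumPy sketch of such a loss follows; this is a generic formulation for illustration, not the paper's exact objective.

```python
import numpy as np

def info_nce_loss(z1, z2, tau=0.5):
    """InfoNCE-style contrastive loss between two views of the same nodes.

    z1, z2: (n_nodes, dim) embeddings from two views; row i of z1 and
    row i of z2 form a positive pair, all other rows act as negatives.
    tau: temperature controlling how sharply similarities are contrasted.
    """
    # L2-normalize so dot products are cosine similarities
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau                       # pairwise similarity matrix
    sim = sim - sim.max(axis=1, keepdims=True)  # subtract row max for stability
    exp_sim = np.exp(sim)
    # positive pairs sit on the diagonal; normalize over all candidates
    pos = np.diag(exp_sim)
    loss = -np.log(pos / exp_sim.sum(axis=1))
    return loss.mean()
```

With matched views the loss is low; shuffling one view so positives are mismatched drives it up, which is the signal an unsupervised model trains against in the absence of labels.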
References
[1] Z. Wu, S. Pan, F. Chen, G. Long, C. Zhang, and P. S. Yu, "A comprehensive survey on graph neural networks," IEEE Trans. Neural Netw. Learn. Syst., vol. 32, no. 1, pp. 4-24, Jan. 2021.
[2] D. Jin et al., "A survey of community detection approaches: From statistical modeling to deep learning," IEEE Trans. Knowl. Data Eng., Aug. 2021.
[3] C. Shi, Y. Li, J. Zhang, Y. Sun, and P. S. Yu, "A survey of heterogeneous information network analysis," IEEE Trans. Knowl. Data Eng., vol. 29, no. 1, pp. 17-37, Jan. 2017.
[4] X. Wang et al., "Heterogeneous graph attention network," Proc. World Wide Web Conf., pp. 2022-2032, May 2019.
[5] S. Yun, M. Jeong, R. Kim, J. Kang, and H. J. Kim, "Graph transformer networks," Proc. NeurIPS, pp. 11960-11970, 2019.
[6] X. Fu, J. Zhang, Z. Meng, and I. King, "MAGNN: Metapath aggregated graph neural network for heterogeneous graph embedding," Proc. Web Conf., pp. 2331-2341, Apr. 2020.
[7] C. Zhang, D. Song, C. Huang, A. Swami, and N. V. Chawla, "Heterogeneous graph neural network," Proc. SIGKDD, pp. 793-803, 2019.
[8] Y. Sun and J. Han, "Mining heterogeneous information networks: A structural analysis approach," ACM SIGKDD Explor. Newslett., vol. 14, no. 2, pp. 20-28, 2013.
[9] C. Park, J. Han, and H. Yu, "Deep multiplex graph infomax: Attentive multiplex network embedding using global information," Knowl.-Based Syst., vol. 197, Jun. 2020.
[10] J. Zhao, X. Wang, C. Shi, Z. Liu, and Y. Ye, "Network schema preserving heterogeneous information network embedding," Proc. 29th Int. Joint Conf. Artif. Intell., pp. 1366-1372, Jul. 2020.
[11] T. N. Kipf and M. Welling, "Semi-supervised classification with graph convolutional networks," Proc. ICLR, pp. 1-14, 2017.
[12] Z. Yu et al., "AS-GCN: Adaptive semantic architecture of graph convolutional networks for text-rich networks," Proc. IEEE Int. Conf. Data Mining (ICDM), pp. 837-846, Dec. 2021.
[13] L. Bai, L. Cui, Y. Jiao, L. Rossi, and E. R. Hancock, "Learning backtrackless aligned-spatial graph convolutional networks for graph classification," IEEE Trans. Pattern Anal. Mach. Intell., vol. 44, no. 2, pp. 783-798, Feb. 2022.
[14] L. Bai et al., "Learning graph convolutional networks based on quantum vertex information propagation," IEEE Trans. Knowl. Data Eng., Aug. 2021.
[15] W. Sheng and X. Li, "Multi-task learning for gait-based identity recognition and emotion recognition using attention enhanced temporal graph convolutional network," Pattern Recognit., vol. 114, Jun. 2021.
[16] Y. Sun, J. Han, X. Yan, P. S. Yu, and T. Wu, "PathSim: Meta path-based top-K similarity search in heterogeneous information networks," Proc. VLDB Endowment, vol. 4, no. 11, pp. 992-1003, 2011.
[17] J. Shang, M. Qu, J. Liu, L. M. Kaplan, J. Han, and J. Peng, "Meta-path guided embedding for similarity search in large-scale heterogeneous information networks," arXiv:1610.09769, 2016.
[18] H. Hong, H. Guo, Y. Lin, X. Yang, Z. Li, and J. Ye, "An attention-based graph neural network for heterogeneous structural learning," Proc. AAAI, pp. 4132-4139, 2020.
[19] Z. Hu, Y. Dong, K. Wang, and Y. Sun, "Heterogeneous graph transformer," Proc. Web Conf., pp. 2704-2710, Apr. 2020.
[20] M. Schlichtkrull, T. N. Kipf, P. Bloem, R. Van Den Berg, I. Titov, and M. Welling, "Modeling relational data with graph convolutional networks," Proc. ESWC, pp. 593-607, 2018.
[21] S. Vashishth, S. Sanyal, V. Nitin, and P. Talukdar, "Composition-based multi-relational graph convolutional networks," Proc. ICLR, pp. 1-14, 2020.
[22] M. I. Belghazi et al., "Mutual information neural estimation," Proc. ICML, vol. 80, pp. 530-539, 2018.
[23] P. H. Le-Khac, G. Healy, and A. F. Smeaton, "Contrastive representation learning: A framework and review," IEEE Access, vol. 8, pp. 193907-193934, 2020.
[24] A. Jaiswal, A. R. Babu, M. Z. Zadeh, D. Banerjee, and F. Makedon, "A survey on contrastive self-supervised learning," arXiv:2011.00362, 2020.
[25] T. Chen, S. Kornblith, M. Norouzi, and G. E. Hinton, "A simple framework for contrastive learning of visual representations," Proc. ICML, pp. 1597-1607, 2020.
[26] M. Caron, I. Misra, J. Mairal, P. Goyal, P. Bojanowski, and A. Joulin, "Unsupervised learning of visual features by contrasting cluster assignments," Proc. NeurIPS, pp. 9912-9924, 2020.
[27] J. Grill et al., "Bootstrap your own latent—A new approach to self-supervised learning," Proc. NeurIPS, pp. 21271-21284, 2020.
[28] Y. Zhu, Y. Xu, F. Yu, Q. Liu, S. Wu, and L. Wang, "Deep graph contrastive representation learning," Proc. ICML, pp. 1-9, 2020.
[29] Y. Zhu, Y. Xu, F. Yu, Q. Liu, S. Wu, and L. Wang, "Graph contrastive learning with adaptive augmentation," arXiv:2010.14945, 2020.
[30] R. Zhu, Z. Tao, Y. Li, and S. Li, "Automated graph learning via population based self-tuning GCN," Proc. 44th Int. ACM SIGIR Conf. Res. Develop. Inf. Retr., pp. 2096-2100, Jul. 2021.
