Incorporating Dynamic Temperature Estimation into Contrastive Learning on Graphs


Abstract:

Contrastive learning, a powerful self-supervised learning paradigm, has shown its efficacy in learning embeddings from independent and identically distributed (IID) as well as non-IID data without relying on label information. Since high-quality discriminative embeddings form a rich embedding space, which benefits model performance on downstream tasks, it is necessary to study how to improve the quality of contrastive node embeddings in graph contrastive learning. However, there has been limited research in this area. In this paper, we investigate how to generate high-quality contrastive node embeddings based on an in-depth analysis of graph contrastive losses. Firstly, we propose a novel and effective method, GLATE, for estimating the temperatures in three mainstream graph contrastive losses during the training phase. Secondly, we present the derivation of GLATE, whose results reveal the specific relationship between the quality of contrastive node embeddings and temperatures. Finally, extensive experiments on 16 benchmark datasets demonstrate that GLATE consistently outperforms state-of-the-art graph contrastive learning models in terms of both model performance and training efficiency.
Date of Conference: 13-16 May 2024
Date Added to IEEE Xplore: 23 July 2024
Conference Location: Utrecht, Netherlands


I. Introduction

Self-supervised learning provides a promising learning paradigm that does not rely on high-cost label information for many research fields such as computer vision [1]–[3], natural language processing [4]–[7], speech recognition [8]–[10], and recommender systems [11]–[13]. Contrastive methods hold a prominent place in the landscape of self-supervised learning [14]–[17]. Contrastive learning leverages the inherent structure and relationships within unlabeled data to train encoder networks [18]–[20]. The core idea behind contrastive learning is to map positives (i.e., positive samples) closer in the embedding space while pushing negatives (i.e., negative samples) apart. This process encourages the encoder network to capture the intricate patterns, semantic relationships, and underlying structures present in the data, making it particularly adept at learning useful embeddings from diverse and complex datasets. Recently, researchers have explored the graph contrastive learning (GCL) framework for self-supervised learning on graphs [21]–[25]. On benchmark datasets, state-of-the-art GCL models have demonstrated competitive performance against supervised models, e.g., the graph convolutional network (GCN) [26], on various graph-related tasks such as node classification [27]–[31], graph classification [32], [33], and link prediction [34]–[37].
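To ground the role of the temperature that GLATE estimates, the sketch below shows the standard temperature-scaled InfoNCE objective on which mainstream graph contrastive losses (e.g., the node-level loss of GRACE [22]) are built. This is an illustrative PyTorch implementation under our own naming (info_nce, the default temperature of 0.5), not the paper's GLATE estimator; the fixed temperature argument is exactly the hyperparameter that GLATE replaces with a dynamically estimated value during training.

import torch
import torch.nn.functional as F

def info_nce(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.5) -> torch.Tensor:
    """z1, z2: [N, d] node embeddings from two augmented views of the same graph.
    Row i of z1 and row i of z2 form a positive pair; all other pairs are negatives."""
    z1 = F.normalize(z1, dim=1)  # use cosine similarity via L2-normalized embeddings
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature  # [N, N] similarity matrix, sharpened by 1/temperature
    labels = torch.arange(z1.size(0), device=z1.device)  # positives lie on the diagonal
    # Cross-entropy pulls each positive pair together while pushing the N-1 negatives apart;
    # a smaller temperature concentrates the gradient on hard negatives.
    return F.cross_entropy(logits, labels)

In GRACE-style training the loss is typically symmetrized over the two views, e.g., 0.5 * (info_nce(z1, z2) + info_nce(z2, z1)); per the abstract, GLATE's contribution is to estimate the temperature dynamically during training rather than fixing it as a tuned hyperparameter.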

References
1.
C. J. Reed, X. Yue, A. Nrusimha, S. Ebrahimi, V. Vijaykumar, R. Mao, et al., "Self-supervised pretraining improves self-supervised pretraining", Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), pp. 2584-2594, January 2022.
2.
T. Huynh, S. Kornblith, M. R. Walter, M. Maire and M. Khademi, "Boosting contrastive self-supervised learning with false negative cancellation", Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), pp. 2785-2795, January 2022.
3.
J. Teng, W. Huang and H. He, "Can pretext-based self-supervised learning be boosted by downstream data? A theoretical analysis", Proceedings of The 25th International Conference on Artificial Intelligence and Statistics (AISTATS), vol. 151, pp. 4198-4216, 2022.
4.
T. Gao, X. Yao and D. Chen, "SimCSE: Simple contrastive learning of sentence embeddings", Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp. 6894-6910, 2021.
5.
Y. Zhang, R. Zhang, S. Mensah, X. Liu and Y. Mao, "Unsupervised sentence representation via contrastive learning with mixing negatives", Thirty-Sixth AAAI Conference on Artificial Intelligence (AAAI), pp. 11730-11738, 2022.
6.
Y. Meng, C. Xiong, P. Bajaj, S. Tiwary, P. Bennett, J. Han, et al., "COCO-LM: Correcting and contrasting text sequences for language model pretraining", Advances in Neural Information Processing Systems, vol. 34, pp. 23102-23114, 2021.
7.
K. Nozawa and I. Sato, "Understanding negative samples in instance discriminative self-supervised representation learning", Advances in Neural Information Processing Systems, vol. 34, pp. 5784-5797, 2021.
8.
I. Gat, H. Aronowitz, W. Zhu, E. Morais and R. Hoory, "Speaker normalization for self-supervised speech emotion recognition", IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 7342-7346, 2022.
9.
D. Cai, W. Wang and M. Li, "Incorporating visual information in audio based self-supervised speaker recognition", IEEE/ACM Transactions on Audio, Speech, and Language Processing, vol. 30, pp. 1422-1435, 2022.
10.
M. Sang, H. Li, F. Liu, A. O. Arnold and L. Wan, "Self-supervised speaker verification with simple Siamese network and self-supervised regularization", IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 6127-6131, 2022.
11.
H. Wu, Y. Zhang, C. Ma, W. Guo, R. Tang, X. Liu, et al., "Intent-aware multi-source contrastive alignment for tag-enhanced recommendation", 39th IEEE International Conference on Data Engineering (ICDE), pp. 1112-1125, 2023.
12.
X. Du, H. Yuan, P. Zhao, J. Fang, G. Liu, Y. Liu, et al., "Contrastive enhanced slide filter mixer for sequential recommendation", 39th IEEE International Conference on Data Engineering (ICDE), pp. 2673-2685, 2023.
13.
J. Gong, Z. Chen, C. Ma, Z. Xiao, H. Wang, G. Tang, et al., "Attention weighted mixture of experts with contrastive learning for personalized ranking in e-commerce", 39th IEEE International Conference on Data Engineering (ICDE), pp. 3222-3234, 2023.
14.
X. Liu, F. Zhang, Z. Hou, L. Mian, Z. Wang, J. Zhang, et al., "Self-supervised learning: Generative or contrastive", IEEE Trans. Knowl. Data Eng., vol. 35, no. 1, pp. 857-876, 2023.
15.
L. Wu, H. Lin, C. Tan, Z. Gao and S. Z. Li, "Self-supervised learning on graphs: Contrastive, generative, or predictive", IEEE Trans. Knowl. Data Eng., vol. 35, no. 4, pp. 4216-4235, 2023.
16.
Y. Liu, M. Jin, S. Pan, C. Zhou, Y. Zheng, F. Xia, et al., "Graph self-supervised learning: A survey", IEEE Trans. Knowl. Data Eng., vol. 35, no. 6, pp. 5879-5900, 2023.
17.
H. Li, J. Cao, J. Zhu, Q. Luo, S. He and X. Wang, "Augmentation-free graph contrastive learning of invariant-discriminative representations", IEEE Transactions on Neural Networks and Learning Systems, 2023.
18.
Y. Xu, B. Shi, T. Ma, B. Dong, H. Zhou and Q. Zheng, "CLDG: Contrastive learning on dynamic graphs", 39th IEEE International Conference on Data Engineering (ICDE), pp. 696-707, 2023.
19.
L. Li, S. Luo, Y. Zhao, C. Shan, Z. Wang and L. Qin, "COCLEP: Contrastive learning-based semi-supervised community search", 39th IEEE International Conference on Data Engineering (ICDE), 2023.
20.
R. Wang, Y. Li and J. Wang, "Sudowoodo: Contrastive self-supervised learning for multi-purpose data integration and preparation", 39th IEEE International Conference on Data Engineering (ICDE), pp. 1502-1515, 2023.
21.
P. Velickovic, W. Fedus, W. L. Hamilton, P. Liò, Y. Bengio and R. D. Hjelm, "Deep graph infomax", International Conference on Learning Representations (ICLR), 2019.
22.
Y. Zhu, Y. Xu, F. Yu, Q. Liu, S. Wu and L. Wang, "Deep graph contrastive representation learning", ICML Workshop on Graph Representation Learning and Beyond (GRL+), 2020.
23.
S. Thakoor, C. Tallec, M. G. Azar, R. Munos, P. Veličković and M. Valko, "Bootstrapped representation learning on graphs", ICLR 2021 Workshop, 2021.
24.
J. Xia, L. Wu, J. Chen, B. Hu and S. Z. Li, "SimGRACE: A simple framework for graph contrastive learning without data augmentation", Proceedings of the ACM Web Conference 2022, pp. 1070-1079, 2022.
25.
S. Feng, B. Jing, Y. Zhu and H. Tong, "Adversarial graph contrastive learning with information regularization", Proceedings of the ACM Web Conference 2022, pp. 1362-1371, 2022.
26.
T. N. Kipf and M. Welling, "Semi-supervised classification with graph convolutional networks", 5th International Conference on Learning Representations (ICLR), 2017.
27.
J. Layne, J. Carpenter, E. Serra and F. Gullo, "Temporal SIR-GN: Efficient and effective structural representation learning for temporal graphs", Proceedings of the VLDB Endowment, vol. 16, no. 9, pp. 2075-2089, 2023.
28.
Y. Zhang and A. Kumar, "Lotan: Bridging the gap between GNNs and scalable graph analytics engines", Proceedings of the VLDB Endowment, vol. 16, no. 11, pp. 2728-2741, 2023.
29.
F. Xiao, Y. Wu, M. Zhang, G. Chen and B. C. Ooi, "MINT: Detecting fraudulent behaviors from time-series relational data", Proceedings of the VLDB Endowment, vol. 16, no. 12, pp. 3610-3623, 2023.
30.
X. Du, X. Zhang, S. Wang and Z. Huang, "Efficient tree-SVD for subset node embedding over large dynamic graphs", Proceedings of the ACM on Management of Data, vol. 1, no. 1, pp. 96:1-96:26, 2023.