I. Introduction
Humans learn skills from two main sources: specialized books and working experience. For example, a good doctor needs both knowledge acquired at school and practical experience gained in the hospital. However, most existing artificial intelligence (AI) models only imitate learning from experience while ignoring the former [1], [2], which makes them less explainable and degrades their performance. Knowledge graphs (KGs), which store human knowledge as facts in intuitive graph structures [3], have been regarded as a potential solution in recent years. However, constructing a KG is a dynamic and continuous process, so most KGs suffer from incompleteness, which hinders their effectiveness in KG-assisted applications such as question answering [4] and recommendation systems [5]. To alleviate this problem, knowledge graph reasoning (KGR), which aims to infer missing facts from existing ones in a KG, has drawn increasing attention in recent years. Taking Fig. 1(a) as the target KG, a KGR model is expected to derive the logic rule $(A, \textit{father of}, B) \wedge (A, \textit{husband of}, C) \rightarrow (C, \textit{mother of}, B)$ and then infer the missing fact (Savannah, mother of, Bronny).
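To make the rule application concrete, the following minimal Python sketch (not part of the paper) applies the derived rule to a toy set of (head, relation, tail) triples. The triple set and the intermediate entity name ("LeBron") are assumptions for illustration based on Fig. 1(a).

```python
# Minimal sketch: applying the rule
#   (A, father of, B) AND (A, husband of, C) -> (C, mother of, B)
# to a toy knowledge graph of (head, relation, tail) triples.
# The triples and the entity "LeBron" are assumed here for illustration.

kg = {
    ("LeBron", "father of", "Bronny"),
    ("LeBron", "husband of", "Savannah"),
}

def apply_mother_rule(triples):
    """Infer (C, mother of, B) whenever (A, father of, B) and (A, husband of, C) both hold."""
    inferred = set()
    for a, r1, b in triples:
        if r1 != "father of":
            continue
        for a2, r2, c in triples:
            if a2 == a and r2 == "husband of":
                candidate = (c, "mother of", b)
                if candidate not in triples:
                    inferred.add(candidate)
    return inferred

print(apply_mother_rule(kg))
# -> {('Savannah', 'mother of', 'Bronny')}
```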