

Link prediction for Knowledge Graphs (KGs) aims to predict missing links between entities. Previous works have utilized Graph Neural Networks (GNNs) to learn embeddings of entities and relations. However, these works consider only the linear aggregation of neighbors and ignore interactions among them, thereby discarding part of the indicative information. To address this issue, we propose Deep Interactions-boosted Embeddings (DInBE), which encodes interaction information to enrich entity representations. To obtain this interaction information, we disentangle each entity representation into multiple components, yielding diverse disentangled representations for every entity. Based on these disentangled representations, we then learn intra-interactions among neighboring entities within the same component and inter-interactions across different components. With the help of interaction information, our model generates more expressive representations. In addition, we propose a relation-aware scoring mechanism that selects the components useful for a given query. Our experiments demonstrate that the proposed model outperforms existing state-of-the-art methods by a large margin on the link prediction task, verifying the effectiveness of exploring interactions and adaptive scoring.
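
The following is a minimal sketch, not the authors' implementation, of two ideas named above: splitting each entity embedding into K disentangled components and using a relation-aware score to weight the components most relevant to a given query relation. All names and hyperparameters (K, dim, `DisentangledEntityEmbedding`, `relation_aware_scores`) are illustrative assumptions, and the intra-/inter-interaction modules are omitted for brevity.

```python
# Hypothetical sketch of disentangled entity components with relation-aware scoring.
# Not the DInBE reference code; shapes and names are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DisentangledEntityEmbedding(nn.Module):
    def __init__(self, num_entities: int, num_relations: int, k: int = 4, dim: int = 32):
        super().__init__()
        self.k, self.dim = k, dim
        # Each entity embedding is stored as K components of size dim.
        self.entity = nn.Embedding(num_entities, k * dim)
        self.relation = nn.Embedding(num_relations, dim)

    def components(self, entity_ids: torch.Tensor) -> torch.Tensor:
        # (batch, K, dim): the K disentangled components of each entity.
        return self.entity(entity_ids).view(-1, self.k, self.dim)

    def relation_aware_scores(self, entity_ids: torch.Tensor, rel_ids: torch.Tensor) -> torch.Tensor:
        # Score each component against the query relation and normalize,
        # so components useful for this relation receive higher weight.
        comp = self.components(entity_ids)               # (batch, K, dim)
        rel = self.relation(rel_ids).unsqueeze(1)        # (batch, 1, dim)
        scores = (comp * rel).sum(-1) / self.dim ** 0.5  # (batch, K)
        return F.softmax(scores, dim=-1)

    def forward(self, entity_ids: torch.Tensor, rel_ids: torch.Tensor) -> torch.Tensor:
        comp = self.components(entity_ids)
        weights = self.relation_aware_scores(entity_ids, rel_ids)  # (batch, K)
        # Weighted sum of components gives a query-conditioned entity representation.
        return (weights.unsqueeze(-1) * comp).sum(dim=1)           # (batch, dim)


if __name__ == "__main__":
    model = DisentangledEntityEmbedding(num_entities=100, num_relations=10)
    heads = torch.tensor([0, 1, 2])
    rels = torch.tensor([3, 4, 5])
    print(model(heads, rels).shape)  # torch.Size([3, 32])
```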