

Link prediction is the main task of knowledge graph completion: predicting missing relations between entities based on the existing links among them. Knowledge graph completion can be framed as a third-order binary tensor completion problem, for which tensor decomposition is a natural solution, and many previous studies have shown that tensor decomposition methods outperform translation-based methods in link prediction experiments. Typical tensor decomposition methods are Canonical Polyadic (CP) decomposition and Tucker decomposition. In this paper, we propose the Block Term Decomposition Embedding model (BTDE) for link prediction, based on the block term decomposition (which can be seen as a combination of CP decomposition and Tucker decomposition) of the binary tensor representation of knowledge graph triples. The embeddings learned through BTDE are interpretable. In addition, we prove that BTDE is fully expressive and derive a bound on its entity and relation embedding dimensionality for full expressivity; this bound matches that of TuckER and is smaller than the bounds of the previous state-of-the-art models ComplEx and SimplE. We show empirically that BTDE outperforms most previous state-of-the-art models across five standard link prediction datasets.
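To make the idea concrete, the following is a minimal sketch (not the paper's implementation) of scoring one triple under a block term decomposition: the score is a sum of N small Tucker contractions, each combining one block of the head, relation, and tail embeddings with a small per-block core tensor. All dimensions, the number of blocks, and the random embeddings are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 4    # number of blocks (hypothetical choice)
d_e = 8  # per-block entity embedding dimension (hypothetical)
d_r = 6  # per-block relation embedding dimension (hypothetical)

# Hypothetical embeddings for one triple (head, relation, tail),
# each split into N blocks.
e_h = rng.normal(size=(N, d_e))
w_r = rng.normal(size=(N, d_r))
e_t = rng.normal(size=(N, d_e))

# One small core tensor per block.
G = rng.normal(size=(N, d_e, d_r, d_e))

def btd_score(e_h, w_r, e_t, G):
    """Score a triple as a sum of small Tucker terms (block term decomposition)."""
    score = 0.0
    for n in range(G.shape[0]):
        # Tucker contraction for block n: G_n x1 e_h[n] x2 w_r[n] x3 e_t[n]
        score += np.einsum('ijk,i,j,k->', G[n], e_h[n], w_r[n], e_t[n])
    return score

print(btd_score(e_h, w_r, e_t, G))
```

Setting N = 1 recovers a plain Tucker scoring function (as in TuckER), while making each core tensor diagonal recovers a CP-style model, which is the sense in which block term decomposition interpolates between the two.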