Link prediction is a crucial research area in both data mining and machine learning. Despite the success of contrastive learning in node classification, applying it directly to link prediction reveals two major weaknesses, namely single-positive-sample contrasting and random augmentation, which result in inferior performance. To overcome these issues, we propose a new contrastive learning approach for link prediction, called Structure-aware Contrastive Representation Learning with Self-discriminating Augmentation (SECRET). Our approach includes a novel data augmentation scheme based on the prediction model itself and optimizes both a contrastive objective and a reconstruction loss, which jointly improve link-prediction performance. Experiments on 11 benchmark datasets demonstrate that SECRET significantly outperforms state-of-the-art baselines.
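The abstract does not spell out the loss formulation, but the two ingredients it names, multi-positive (structure-aware) contrasting and a joint contrastive-plus-reconstruction objective, can be illustrated with a minimal sketch. The PyTorch code below is an assumption-laden illustration, not the authors' implementation: it assumes the structure-aware term is a multi-positive InfoNCE loss in which a node's counterpart and its graph neighbors in the other view serve as positives, and that reconstruction is a binary cross-entropy loss on inner-product edge scores. All function names and the weights `tau` and `alpha` are hypothetical.

```python
# Hypothetical sketch of a joint contrastive + reconstruction objective
# for link prediction. Illustrative only; not the SECRET implementation.
import torch
import torch.nn.functional as F

def structure_aware_contrastive_loss(z1, z2, adj, tau=0.5):
    """Multi-positive InfoNCE: for node i, the positives in the other
    view are its own counterpart plus its graph neighbors (one plausible
    reading of "structure-aware" multi-positive contrasting)."""
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau                              # (N, N) similarities
    pos_mask = ((adj + torch.eye(adj.size(0), device=adj.device)) > 0).float()
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    # Average log-probability over each node's positive set, then over nodes.
    return -(pos_mask * log_prob).sum(dim=1).div(pos_mask.sum(dim=1)).mean()

def reconstruction_loss(z, pos_edges, neg_edges):
    """BCE link-reconstruction loss on inner-product edge scores;
    pos_edges / neg_edges are (2, E) index tensors."""
    pos_score = (z[pos_edges[0]] * z[pos_edges[1]]).sum(dim=1)
    neg_score = (z[neg_edges[0]] * z[neg_edges[1]]).sum(dim=1)
    scores = torch.cat([pos_score, neg_score])
    targets = torch.cat([torch.ones_like(pos_score),
                         torch.zeros_like(neg_score)])
    return F.binary_cross_entropy_with_logits(scores, targets)

def joint_loss(z1, z2, adj, pos_edges, neg_edges, alpha=0.5):
    """Weighted sum of the contrastive and reconstruction terms."""
    return (alpha * structure_aware_contrastive_loss(z1, z2, adj)
            + (1 - alpha) * reconstruction_loss(z1, pos_edges, neg_edges))

# Toy usage on random embeddings of 4 nodes with one positive/negative edge.
z1, z2 = torch.randn(4, 16), torch.randn(4, 16)
adj = torch.tensor([[0, 1, 0, 0], [1, 0, 1, 0],
                    [0, 1, 0, 1], [0, 0, 1, 0]], dtype=torch.float)
pos_edges = torch.tensor([[0], [1]])
neg_edges = torch.tensor([[0], [3]])
print(joint_loss(z1, z2, adj, pos_edges, neg_edges))
```

In such a formulation, the neighbor-based positive set addresses the single-positive weakness the abstract mentions; how the self-discriminating augmentation produces the two views from the prediction model itself is not recoverable from the abstract and is left out of the sketch.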