KGBoost: A classification-based knowledge base completion method with negative sampling

Bibliographic Details
Published in: Pattern Recognition Letters, 2022-05, Vol. 157, pp. 104-111
Main Authors: Wang, Yun-Cheng; Ge, Xiou; Wang, Bin; Kuo, C.-C. Jay
Format: Article
Language: English
Description
Summary:
• A modularized design of a classification-based method for knowledge base completion.
• Relation inference patterns are incorporated during training.
• Two novel negative sampling strategies are proposed.
• Extensive experiments and analysis on four benchmark datasets.

Knowledge base completion is formulated as a binary classification problem in this work, where an XGBoost binary classifier is trained for each relation using relevant links in knowledge graphs (KGs). The new method, named KGBoost, adopts a modularized design and attempts to find hard negative samples in order to train a powerful classifier for missing link prediction. We conduct experiments on multiple benchmark datasets and demonstrate that KGBoost outperforms state-of-the-art methods across most datasets. Furthermore, compared with models trained by end-to-end optimization, KGBoost works well in the low-dimensional setting, allowing a smaller model size.
ISSN: 0167-8655, 1872-7344
DOI: 10.1016/j.patrec.2022.04.001
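
The abstract describes the core formulation: one binary classifier per relation, trained on features of entity pairs, with negatives obtained by corrupting observed triples. The Python sketch below only illustrates that per-relation binary-classification setup; it is not the authors' KGBoost implementation. The toy triples, random entity embeddings, concatenated-embedding features, and simple tail-corruption negative sampling are all illustrative assumptions (the paper proposes harder negative-sampling strategies and incorporates relation inference patterns during training).

# Minimal sketch of per-relation binary classification for KG completion.
# NOT the authors' KGBoost implementation: the toy triples, random entity
# embeddings, and tail-corruption negative sampling are assumptions made
# purely for illustration.
import numpy as np
from xgboost import XGBClassifier

rng = np.random.default_rng(0)

# Toy knowledge graph: entity vocabulary and observed triples for one relation.
entities = ["paris", "france", "berlin", "germany", "tokyo", "japan"]
ent2id = {e: i for i, e in enumerate(entities)}
capital_of = [("paris", "france"), ("berlin", "germany"), ("tokyo", "japan")]

# Pretrained entity embeddings would normally come from a KG embedding model;
# random vectors stand in for them here.
dim = 16
emb = rng.normal(size=(len(entities), dim))

def features(h, t):
    """Concatenate head and tail embeddings as the classifier input."""
    return np.concatenate([emb[ent2id[h]], emb[ent2id[t]]])

# Positive samples: observed (head, tail) pairs for this relation.
X_pos = np.stack([features(h, t) for h, t in capital_of])

# Negative samples: corrupt the tail with a random other entity
# (a simple stand-in for the paper's harder negative-sampling strategies).
X_neg = np.stack([
    features(h, rng.choice([e for e in entities if e != t]))
    for h, t in capital_of
])

X = np.vstack([X_pos, X_neg])
y = np.array([1] * len(X_pos) + [0] * len(X_neg))

# One binary XGBoost classifier trained per relation.
clf = XGBClassifier(n_estimators=50, max_depth=3)
clf.fit(X, y)

# Score a candidate missing link: higher probability = more plausible triple.
print(clf.predict_proba(features("paris", "germany").reshape(1, -1))[0, 1])

In this sketch, scoring a missing link reduces to calling the relation's classifier on the candidate (head, tail) pair, which mirrors the binary-classification view of link prediction described in the abstract.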