Learning high-order structural and attribute information by knowledge graph attention networks for enhancing knowledge graph embedding
Published in: Knowledge-Based Systems, 2022-08, Vol. 250, p. 109002, Article 109002
Format: Article
Language: English
Summary: The goal of knowledge graph representation learning is to encode both entities and relations into a low-dimensional embedding space. Many recent works have demonstrated the benefits of knowledge graph embedding on knowledge graph completion tasks such as relation extraction. However, we observe that: (1) existing methods take only direct relations between entities into consideration and fail to express high-order structural relationships between entities; (2) these methods leverage only the relation triples of knowledge graphs while ignoring the large number of attribute triples that encode rich semantic information. To overcome these limitations, this paper proposes a novel knowledge graph embedding method, named KANE, which is inspired by recent developments in graph convolutional networks (GCNs). KANE captures both high-order structural and attribute information of knowledge graphs in an efficient, explicit, and unified manner within the graph convolutional network framework. Empirical results on three datasets show that KANE significantly outperforms seven state-of-the-art methods. Further analysis verifies the efficiency of our method and the benefits brought by the attention mechanism.
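The core operation the abstract describes, attention-weighted aggregation over an entity's graph neighborhood within a GCN-style framework, can be sketched as follows. This is a minimal illustrative sketch in the spirit of graph attention networks; the function name, parameters `W` and `a`, and the GAT-style scoring are assumptions for illustration, not the paper's actual KANE implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def attention_aggregate(h_self, h_neighbors, W, a):
    """One attention-weighted aggregation step over an entity's neighbors.

    GAT-style sketch (hypothetical parameters): W projects embeddings,
    a scores the compatibility of the target entity with each neighbor.
    """
    z_self = W @ h_self                                   # (d_out,)
    z_nbrs = np.stack([W @ h for h in h_neighbors])       # (n, d_out)
    # Score each neighbor by applying a to the pair [z_self ; z_j].
    pairs = np.concatenate(
        [np.tile(z_self, (len(z_nbrs), 1)), z_nbrs], axis=1)  # (n, 2*d_out)
    scores = leaky_relu(pairs @ a)                        # (n,)
    # Softmax turns scores into attention weights over the neighborhood.
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()
    return alpha @ z_nbrs                                 # (d_out,)

# Toy example: one entity with three neighbors (random embeddings).
d_in, d_out, n = 8, 4, 3
W = rng.standard_normal((d_out, d_in))
a = rng.standard_normal(2 * d_out)
h = rng.standard_normal(d_in)
nbrs = [rng.standard_normal(d_in) for _ in range(n)]
out = attention_aggregate(h, nbrs, W, a)
print(out.shape)  # (4,)
```

Stacking several such layers lets information flow from multi-hop neighbors, which is one way to capture the high-order structure the abstract refers to; attribute triples could likewise enter the neighborhood as additional vectors, though how KANE does this specifically is described in the paper itself.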
ISSN: 0950-7051; 1872-7409
DOI: 10.1016/j.knosys.2022.109002