Convolutional neural network for knowledge graph completion
Format: Conference Proceeding
Language: English
Summary: Knowledge Graph Embedding aims to learn the relationship between two entities by representing them in a real-valued vector space. However, learning knowledge graph embeddings raises two significant challenges: 1) How can deep learning techniques be used to their full potential to construct expressive embeddings? 2) How can the polysemy problem produced by multi-relational knowledge graphs, where relations and entities carry multiple semantics depending on the prediction at hand, be handled? To address the first problem, this article uses multilayer convolutional neural networks to create features, which are then employed to predict potential relations. Furthermore, the network's representational power is enhanced by incorporating an effective recalibration method that preferentially amplifies relevant features. To address the second problem, we propose learning multiple specialized relation embeddings. Rather than maintaining a single generic embedding that holds all the information for each relation and entity, their connections are recorded to capture a cross-semantic effect from entities to relations and vice versa. Compared with existing knowledge graph embedding models, the proposed model generalizes better and better captures potential links between relations and entities. Experimental results show that the proposed model performs better on generic evaluation metrics for relation prediction tasks. (An illustrative sketch of the described architecture follows this record.)
ISSN: 0094-243X (print); 1551-7616 (online)
DOI: 10.1063/5.0175625
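
The summary above names two ingredients without giving details: a multilayer CNN that builds features from entity and relation embeddings, and a recalibration method that preferentially amplifies relevant features. The sketch below is a minimal, hypothetical PyTorch rendering of those ideas under common assumptions (a ConvE-style reshaping of embeddings into a 2D grid, and squeeze-and-excitation-style channel recalibration); all class names, dimensions, and wiring are illustrative, not the authors' implementation, and the multiple specialized relation embeddings used for the polysemy problem are omitted for brevity.

```python
# Minimal illustrative sketch, NOT the paper's architecture. Assumes a
# ConvE-style scorer (embeddings reshaped to a 2D grid and convolved)
# and SE-style channel recalibration as one plausible reading of the
# "recalibration method that preferentially amplifies relevant features".
import torch
import torch.nn as nn


class SERecalibration(nn.Module):
    """Squeeze-and-excitation style recalibration: globally pool each
    feature map, pass the result through a small bottleneck MLP, and
    rescale channels so informative features are amplified."""
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, h, w) -> per-channel scale in [0, 1]
        scale = self.fc(x.mean(dim=(2, 3)))           # (batch, channels)
        return x * scale.unsqueeze(-1).unsqueeze(-1)  # rescale channels


class ConvKGScorer(nn.Module):
    """Score (head, relation, ?) queries: reshape the head and relation
    vectors into a 2D grid, extract features with stacked convolutions,
    recalibrate them, then match against all entity embeddings."""
    def __init__(self, n_entities: int, n_relations: int,
                 dim: int = 200, grid: tuple = (10, 20)):
        super().__init__()
        assert grid[0] * grid[1] == dim
        self.entity = nn.Embedding(n_entities, dim)
        self.relation = nn.Embedding(n_relations, dim)
        self.grid = grid
        self.conv = nn.Sequential(                # "multilayer" CNN
            nn.Conv2d(1, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        self.recalibrate = SERecalibration(32)
        self.project = nn.Linear(32 * 2 * grid[0] * grid[1], dim)

    def forward(self, head_idx: torch.Tensor, rel_idx: torch.Tensor):
        h, w = self.grid
        head = self.entity(head_idx).view(-1, 1, h, w)
        rel = self.relation(rel_idx).view(-1, 1, h, w)
        x = torch.cat([head, rel], dim=2)    # stack into one "image"
        x = self.recalibrate(self.conv(x))   # features, recalibrated
        x = self.project(x.flatten(1))       # back to embedding space
        return x @ self.entity.weight.t()    # score against all tails


# Toy usage: score one (head, relation) pair against every entity.
model = ConvKGScorer(n_entities=1000, n_relations=50)
scores = model(torch.tensor([3]), torch.tensor([7]))
print(scores.shape)  # torch.Size([1, 1000])
```

Scoring a query against the full entity table in one matrix product, as in the last line of `forward`, is the usual 1-N training setup for such models; per-triple scoring would work as well but is slower.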