
CK-Encoder: Enhanced Language Representation for Sentence Similarity

Bibliographic Details
Published in: International Journal of Crowd Science, 2022-04, Vol. 6 (1), pp. 17-22
Main Authors: Jiang, Tao; Kang, Fengjian; Guo, Wei; He, Wei; Liu, Lei; Lu, Xudong; Xu, Yonghui; Cui, Lizhen
Format: Article
Language: English
Description
Summary: In recent years, neural networks have been widely used in natural language processing, especially in sentence similarity modeling. Most previous studies focus only on the sentence itself, ignoring the commonsense knowledge related to it. Commonsense knowledge can be remarkably useful for understanding the semantics of sentences. This paper proposes CK-Encoder, which effectively acquires commonsense knowledge to improve sentence similarity modeling. Specifically, the model first generates a commonsense knowledge graph for the input sentence and then encodes this graph with a graph convolutional network. In addition, CKER, a framework combining CK-Encoder with a sentence encoder, is introduced. Experiments on two sentence similarity tasks demonstrate that CK-Encoder effectively acquires commonsense knowledge and improves a model's ability to understand sentences.
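The record does not include the paper's architecture details, so the following is only a minimal sketch of the idea the abstract describes: encode a per-sentence commonsense knowledge graph with a small graph convolutional network and fuse the resulting graph vector with a sentence-encoder vector for similarity scoring. All names (SimpleGCNLayer, CommonsenseGraphEncoder, cker_similarity), the two-layer GCN depth, the mean pooling, and the concatenate-then-cosine fusion are assumptions for illustration, not the paper's actual design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleGCNLayer(nn.Module):
    """One graph-convolution layer: H' = ReLU(A_hat @ H @ W)."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, node_feats, adj_norm):
        # adj_norm: normalized adjacency with self-loops, shape (N, N)
        # node_feats: node feature matrix, shape (N, in_dim)
        return F.relu(adj_norm @ self.linear(node_feats))


class CommonsenseGraphEncoder(nn.Module):
    """Hypothetical stand-in for CK-Encoder: encodes a per-sentence
    commonsense knowledge graph into one vector via stacked GCN layers
    followed by mean pooling over the nodes."""

    def __init__(self, node_dim, hidden_dim):
        super().__init__()
        self.gcn1 = SimpleGCNLayer(node_dim, hidden_dim)
        self.gcn2 = SimpleGCNLayer(hidden_dim, hidden_dim)

    def forward(self, node_feats, adj_norm):
        h = self.gcn1(node_feats, adj_norm)
        h = self.gcn2(h, adj_norm)
        return h.mean(dim=0)  # graph-level vector, shape (hidden_dim,)


def cker_similarity(sent_vec_a, graph_vec_a, sent_vec_b, graph_vec_b):
    """Hypothetical CKER-style fusion: concatenate each sentence-encoder
    vector with its commonsense-graph vector, then compare by cosine."""
    rep_a = torch.cat([sent_vec_a, graph_vec_a])
    rep_b = torch.cat([sent_vec_b, graph_vec_b])
    return F.cosine_similarity(rep_a, rep_b, dim=0)


# Toy usage: a 4-node commonsense graph with 16-dim node features.
enc = CommonsenseGraphEncoder(node_dim=16, hidden_dim=32)
nodes = torch.randn(4, 16)
adj = torch.eye(4)  # placeholder normalized adjacency
graph_vec = enc(nodes, adj)
```

In this sketch, the graph encoder and the sentence encoder produce independent vectors that are fused only at comparison time; the actual CKER framework may integrate the two representations differently.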
ISSN: 2398-7294
DOI: 10.26599/IJCS.2022.9100001