
Efficient Relational Sentence Ordering Network

Bibliographic Details
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022-10, Vol. 44 (10), p. 6169-6183
Main Authors: Li, Yingming; Cui, Baiyun; Zhang, Zhongfei
Format: Article
Language:English
Description
Summary: In this paper, we propose a novel deep Efficient Relational Sentence Ordering Network (referred to as ERSON) that leverages a pre-trained language model in both the encoder and decoder architectures to strengthen the coherence modeling of the entire model. Specifically, we first introduce a divide-and-fuse BERT (referred to as DF-BERT), a new refactoring of the BERT network, in which the lower layers encode each sentence of the paragraph independently, so that their outputs can be shared across different sentence pairs, while the higher layers jointly learn the cross-attention between sentence pairs. This design captures the semantic concepts and contextual information between the sentences of the paragraph while significantly reducing runtime and memory consumption without sacrificing model performance. In addition, we develop a Relational Pointer Decoder (referred to as RPD), which utilizes the pre-trained Next Sentence Prediction (NSP) task of BERT to capture useful relative ordering information between sentences and thereby enhance the order predictions. Furthermore, a variety of knowledge-distillation-based losses are added as auxiliary supervision to further improve the ordering performance. Extensive evaluations on Sentence Ordering, Order Discrimination, and Multi-Document Summarization tasks show the superiority of ERSON over state-of-the-art ordering methods.
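The divide-and-fuse idea summarized above (lower layers encode each sentence once so the result can be reused across pairs; higher layers fuse a sentence pair with cross-attention) can be illustrated with a small sketch. The code below is not the authors' DF-BERT implementation: the layer counts, hidden size, mean-pooling step, and pairing loop are assumptions made purely for illustration.

# Minimal, illustrative sketch of a divide-and-fuse encoder, NOT the paper's DF-BERT.
import torch
import torch.nn as nn

class DivideAndFuseEncoder(nn.Module):
    def __init__(self, d_model=768, nhead=12, n_lower=8, n_upper=4):
        super().__init__()
        # Lower ("divide") layers: run on each sentence independently,
        # so the encoded output can be shared by every pair the sentence appears in.
        self.lower = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True),
            num_layers=n_lower)
        # Upper ("fuse") layers: run on concatenated sentence pairs,
        # letting cross-attention model inter-sentence relations.
        self.upper = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True),
            num_layers=n_upper)

    def forward(self, sent_embs):
        # sent_embs: list of (len_i, d_model) token embeddings, one per sentence.
        # Encode each sentence once; reuse the result for all ordered pairs.
        encoded = [self.lower(s.unsqueeze(0)).squeeze(0) for s in sent_embs]
        pair_reprs = {}
        n = len(encoded)
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                # Fuse a pair by concatenating along the token dimension.
                pair = torch.cat([encoded[i], encoded[j]], dim=0).unsqueeze(0)
                fused = self.upper(pair).squeeze(0)
                # Mean-pool as a stand-in for the pair representation used downstream.
                pair_reprs[(i, j)] = fused.mean(dim=0)
        return pair_reprs

if __name__ == "__main__":
    enc = DivideAndFuseEncoder()
    sents = [torch.randn(12, 768), torch.randn(9, 768), torch.randn(15, 768)]
    pairs = enc(sents)
    print(len(pairs), pairs[(0, 1)].shape)  # 6 ordered pairs, each a 768-d vector

The efficiency gain in the sketch comes from running the lower stack once per sentence rather than once per pair; only the comparatively shallow upper stack is evaluated for every pair, which mirrors the runtime and memory savings claimed for DF-BERT.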
ISSN: 0162-8828, 2160-9292, 1939-3539
DOI: 10.1109/TPAMI.2021.3085738