Entity Highlight Generation as Statistical and Neural Machine Translation
Published in: | IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2018-10, Vol. 26 (10), pp. 1860-1872 |
Main Authors: | , , , , |
Format: | Article |
Language: | English |
Summary: | An entity highlight is a short, concise, and characteristic description of an entity that can be used in a variety of applications. In this article, we study the problem of automatically generating entity highlights from the descriptive sentences of entities. Specifically, we develop two computational approaches: one inspired by statistical machine translation (SMT), and another based on sequence-to-sequence (Seq2Seq) learning, which has been successfully applied to neural machine translation and neural summarization. In the Seq2Seq approach, we use attention, copy, and coverage mechanisms. To generate entity-specific highlights, we also incorporate the entity name into the Seq2Seq model to guide the decoding process. We automatically collect large-scale instances as training data without any manual annotation, and ask annotators to create a test set. We compare against several strong baseline methods, and evaluate the approaches with both automatic and manual evaluation. Experimental results show that the entity-enhanced Seq2Seq model with attention, copy, and coverage mechanisms significantly outperforms all other approaches on multiple evaluation metrics. |
ISSN: | 2329-9290; 2329-9304 |
DOI: | 10.1109/TASLP.2018.2845111 |
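
The summary describes a Seq2Seq decoder that combines attention, copy, and coverage mechanisms and conditions generation on the entity name. The sketch below is a hypothetical NumPy illustration of a single decoding step in that style (a pointer-generator-like step with coverage, plus an entity embedding added to the attention scores). The weight names, dimensions, and the exact way the entity embedding enters the model are assumptions made for illustration, not the authors' implementation.

```python
# Hypothetical sketch (not the paper's code) of one decoding step of an
# attention + copy + coverage decoder whose attention is also conditioned
# on an entity embedding. All parameter names and sizes are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

d_h, d_s, d_e, vocab_size = 8, 8, 4, 20   # toy dimensions

# Illustrative parameters, randomly initialised for the sketch.
W_h = rng.normal(size=(d_h, d_h))               # projects encoder states
W_s = rng.normal(size=(d_h, d_s))               # projects decoder state
W_e = rng.normal(size=(d_h, d_e))               # projects the entity embedding
w_c = rng.normal(size=(d_h,))                   # projects per-position coverage
v   = rng.normal(size=(d_h,))                   # attention scoring vector
W_v = rng.normal(size=(vocab_size, d_s + d_h))  # generation (vocab) projection
w_g = rng.normal(size=(d_s + d_h,))             # generation-probability gate

def decode_step(enc_states, src_ids, dec_state, entity_emb, coverage):
    """One step: returns a distribution over vocabulary plus copied source
    tokens, along with the updated coverage vector."""
    # Attention scores see the coverage so far and the entity embedding.
    scores = np.array([
        v @ np.tanh(W_h @ h + W_s @ dec_state + W_e @ entity_emb + w_c * c)
        for h, c in zip(enc_states, coverage)
    ])
    attn = softmax(scores)                      # attention distribution
    context = attn @ enc_states                 # weighted sum of encoder states

    # Probability of generating from the vocabulary vs. copying from the source.
    p_gen = sigmoid(w_g @ np.concatenate([dec_state, context]))

    # Vocabulary (generation) distribution.
    p_vocab = softmax(W_v @ np.concatenate([dec_state, context]))

    # Mix generation and copy: attention mass is routed to source token ids.
    p_final = p_gen * p_vocab
    for a, tok in zip(attn, src_ids):
        p_final[tok] += (1.0 - p_gen) * a

    return p_final, coverage + attn             # coverage accumulates attention


# Toy usage: 5 source positions, all token ids inside the toy vocabulary.
enc_states = rng.normal(size=(5, d_h))
src_ids    = [3, 7, 7, 12, 1]
dec_state  = rng.normal(size=d_s)
entity_emb = rng.normal(size=d_e)
coverage   = np.zeros(5)

dist, coverage = decode_step(enc_states, src_ids, dec_state, entity_emb, coverage)
print("next-token distribution sums to", round(dist.sum(), 6))
```

In this sketch the coverage vector accumulates past attention so repeated attention to the same source position can be penalised, and the copy path lets the decoder reproduce source words (for example, parts of the entity's descriptive sentences) directly; how the paper actually wires the entity name into the decoder should be taken from the article itself.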