Recent advances of neural text generation: Core tasks, datasets, models and challenges
Published in: Science China Technological Sciences, 2020-10, Vol. 63 (10), p. 1990-2010
Main Authors: , , , ,
Format: Article
Language: English
Summary: In recent years, deep neural networks have achieved great success in solving many natural language processing tasks. In particular, substantial progress has been made in neural text generation, which takes linguistic or non-linguistic input and generates natural language text. This survey aims to provide an up-to-date synthesis of the core tasks in neural text generation and the architectures adopted to handle them, and to draw attention to the challenges in neural text generation. We first outline the mainstream neural text generation frameworks, and then introduce in detail the datasets, advanced models, and challenges of four core text generation tasks: AMR-to-text generation, data-to-text generation, and two text-to-text generation tasks (text summarization and paraphrase generation). Finally, we present future research directions for neural text generation. This survey can serve as a guide and reference for researchers and practitioners in this area.
ISSN: 1674-7321; 1869-1900
DOI: 10.1007/s11431-020-1622-y