Pre-trained models for natural language processing: A survey

Bibliographic Details
Published in: Science China Technological Sciences, 2020-10, Vol. 63 (10), p. 1872-1897
Main Authors: Qiu, XiPeng, Sun, TianXiang, Xu, YiGe, Shao, YunFan, Dai, Ning, Huang, XuanJing
Format: Article
Language: English
Description
Summary: Recently, the emergence of pre-trained models (PTMs) has brought natural language processing (NLP) to a new era. In this survey, we provide a comprehensive review of PTMs for NLP. We first briefly introduce language representation learning and its research progress. Then we systematically categorize existing PTMs based on a taxonomy from four different perspectives. Next, we describe how to adapt the knowledge of PTMs to downstream tasks. Finally, we outline some potential directions of PTMs for future research. This survey is intended to be a hands-on guide for understanding, using, and developing PTMs for various NLP tasks.
ISSN: 1674-7321; 1869-1900
DOI: 10.1007/s11431-020-1647-3
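
As an illustrative sketch of the adaptation step mentioned in the abstract (adapting a PTM to a downstream task by fine-tuning), the following assumes the Hugging Face transformers library and a BERT checkpoint as stand-ins; the tiny sentiment batch, labels, and hyperparameters are hypothetical and not drawn from the record or the survey itself.

# Minimal sketch (assumptions: Hugging Face "transformers", a BERT checkpoint,
# and a two-class sentiment task) of fine-tuning a pre-trained model on a
# downstream classification task.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # adds a new task-specific classification head
)

# Tiny hypothetical batch, just to show one fine-tuning step.
texts = ["A delightful read.", "Tedious and confusing."]
labels = torch.tensor([1, 0])
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
outputs = model(**batch, labels=labels)  # returns cross-entropy loss for the batch
outputs.loss.backward()                  # backpropagate through the pre-trained encoder
optimizer.step()                         # update both the encoder and the new head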