Summarization of COVID-19 news documents deep learning-based using transformer architecture
Published in: Telkomnika, 2021-06, Vol. 19 (3), p. 754-761
Main Authors: , ,
Format: Article
Language: English
Summary: Received Jul 25, 2020; Revised Oct 12, 2020; Accepted Oct 23, 2020. Keywords: COVID-19; Deep learning; News summarization; Transformer architecture. ABSTRACT: Facing the news on the internet about the spread of coronavirus disease 2019 (COVID-19) is challenging because it takes a long time to extract valuable information from the news. Other models that have been used for abstractive news summarization are sequence models such as long short-term memory (LSTM) and recurrent neural network (RNN). The transformer, as a base language model, has significantly impacted the NLP research field by overcoming the deficiencies of LSTM-, CNN-, and RNN-based deep learning architectures [12, 13], which is among the reasons the transformer was chosen as the base model architecture. The attention function can be defined as a function that maps a query Q, representing the target sequence, together with key-value pairs K and V derived from the source sequence, to an output.
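The attention function described in the abstract corresponds to the standard scaled dot-product attention from the transformer literature. The sketch below is not taken from the article; the function name, shapes, and toy data are illustrative assumptions, shown only to make the mapping of Q, K, and V concrete.

```python
# Minimal sketch (assumed, not from the article) of scaled dot-product attention:
# Q is the query (target sequence); K and V are keys/values derived from the source sequence.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]                                     # key/query dimension
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_k)      # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)        # softmax over the source positions
    return weights @ V                                    # weighted sum of the values

# Toy usage: batch of 1, 4 target positions, 6 source positions, d_k = 8
Q = np.random.randn(1, 4, 8)
K = np.random.randn(1, 6, 8)
V = np.random.randn(1, 6, 8)
print(scaled_dot_product_attention(Q, K, V).shape)        # (1, 4, 8)
```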
ISSN: 1693-6930, 2302-9293
DOI: 10.12928/telkomnika.v19i3.18356