
Arabic News Summarization based on T5 Transformer Approach

Bibliographic Details
Main Authors: Ismail, Qusai; Alissa, Kefah; Duwairi, Rehab M.
Format: Conference Proceeding
Language: English
Subjects:
Online Access: Request full text
container_end_page 7
container_start_page 1
creator Ismail, Qusai; Alissa, Kefah; Duwairi, Rehab M.
description The problem of automatic text summarization is one of the most challenging problems in the field of natural language processing. With the huge amount of data available on the internet, automatic text summarization techniques have become necessary to condense this large number of documents and extract information quickly and efficiently. In this work, the automatic text summarization problem was investigated using transfer learning with a customized Unified Text-to-Text Transformer T5 (t5-arabic-base) model. The t5-arabic-base model was fine-tuned on an Aljazeera.net news dataset and produced state-of-the-art performance in abstractive automatic text summarization. The experiments achieved F1-measures for ROUGE1, ROUGE2, and ROUGEL equal to 62.84%, 54.84%, and 61.98%, respectively. Finally, we explained the model's reasoning process using heat maps and saliency maps. In addition, the model's sensitivity to slight perturbations of the input was examined using adversarial examples generated with the Input Reduction and HotFlip techniques.
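The abstract reports F1-measures for ROUGE1, ROUGE2, and ROUGEL. For readers unfamiliar with these metrics, the following is a minimal pure-Python sketch of how ROUGE-N and ROUGE-L F1 scores are computed (whitespace tokenization, no stemming or stopword handling). This is an illustration of the metrics only, not the paper's evaluation code, which would typically use a dedicated library such as rouge-score.

```python
from collections import Counter

def ngrams(tokens, n):
    """Multiset of word n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def rouge_n_f1(candidate, reference, n=1):
    """ROUGE-N F1: harmonic mean of n-gram precision and recall."""
    cand, ref = ngrams(candidate.split(), n), ngrams(reference.split(), n)
    overlap = sum((cand & ref).values())  # clipped n-gram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

def rouge_l_f1(candidate, reference):
    """ROUGE-L F1, based on the longest common subsequence (LCS) of words."""
    c, r = candidate.split(), reference.split()
    # Classic O(len(c) * len(r)) LCS dynamic program.
    dp = [[0] * (len(r) + 1) for _ in range(len(c) + 1)]
    for i, x in enumerate(c):
        for j, y in enumerate(r):
            dp[i + 1][j + 1] = dp[i][j] + 1 if x == y else max(dp[i][j + 1], dp[i + 1][j])
    lcs = dp[-1][-1]
    if lcs == 0:
        return 0.0
    precision, recall = lcs / len(c), lcs / len(r)
    return 2 * precision * recall / (precision + recall)

print(round(rouge_n_f1("the cat sat on the mat", "the cat lay on the mat", n=1), 4))  # 0.8333
```

A summary scoring 62.84% ROUGE1 F1, as reported above, shares roughly that fraction of its unigrams with the reference summary, balanced between precision and recall.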
doi_str_mv 10.1109/ICICS60529.2023.10330509
format conference_proceeding
publisher IEEE
publication_date 2023-11-21
eisbn 9798350307863
fulltext fulltext_linktorsrc
identifier EISSN: 2573-3346
ispartof 2023 14th International Conference on Information and Communication Systems (ICICS), 2023, p.1-7
issn 2573-3346
language eng
recordid cdi_ieee_primary_10330509
source IEEE Xplore All Conference Series
subjects adversarial attacks
Automatic Arabic text summarization
Communication systems
Deep learning
model interpretability
Perturbation methods
Robustness
Sensitivity
Transfer learning
Transformers
title Arabic News Summarization based on T5 Transformer Approach