Contextual Text Embeddings for Twi

Bibliographic Details
Published in: arXiv.org, 2021-03
Main Authors: Azunre, Paul; Osei, Salomey; Addo, Salomey; Adu-Gyamfi, Lawrence Asamoah; Moore, Stephen; Adabankah, Bernard; Opoku, Bernard; Asare-Nyarko, Clara; Nyarko, Samuel; Amoaba, Cynthia; Appiah, Esther Dansoa; Akwerh, Felix; Lawson, Richard Nii Lante; Budu, Joel; Debrah, Emmanuel; Boateng, Nana; Ofori, Wisdom; Buabeng-Munkoh, Edwin; Adjei, Franklin; Ampomah, Isaac Kojo Essel; Otoo, Joseph; Borkor, Reindorf; Mensah, Standylove Birago; Mensah, Lucien; Marcel, Mark Amoako; Amponsah, Anokye Acheampong; Hayfron-Acquah, James Ben
Format: Article
Language: English
Description
Summary: Transformer-based language models have been changing the modern Natural Language Processing (NLP) landscape for high-resource languages such as English, Chinese, and Russian. However, this technology does not yet exist for any Ghanaian language. In this paper, we introduce the first such models for Twi or Akan, the most widely spoken Ghanaian language. The specific contribution of this work is the development of several pretrained transformer language models for the Akuapem and Asante dialects of Twi, paving the way for advances in application areas such as Named Entity Recognition (NER), Neural Machine Translation (NMT), Sentiment Analysis (SA), and Part-of-Speech (POS) tagging. In particular, we introduce four different flavours of ABENA (A BERT model Now in Akan), which is fine-tuned on a set of Akan corpora, and BAKO (BERT with Akan Knowledge Only), which is trained from scratch. We open-source the models through the Hugging Face model hub and demonstrate their use via a simple sentiment classification example.
ISSN:2331-8422
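
The summary notes that the pretrained models are released through the Hugging Face model hub. Below is a minimal sketch of how such a checkpoint could be loaded with the transformers library; the model identifier "Ghana-NLP/abena-base-asante-twi-uncased" and the Twi example sentence are illustrative assumptions, not details taken from this record.

# Minimal sketch (Python): loading an ABENA-style checkpoint from the
# Hugging Face model hub. The model id and the Twi example sentence are
# assumptions for illustration only.
from transformers import pipeline

# Build a fill-mask pipeline around the (assumed) pretrained masked language model.
unmasker = pipeline(
    "fill-mask",
    model="Ghana-NLP/abena-base-asante-twi-uncased",  # hypothetical model id
)

# Predict the masked Twi token; "Me din de [MASK]." roughly means "My name is [MASK]."
for prediction in unmasker("Me din de [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))

The same checkpoint could be fine-tuned with a classification head to reproduce the kind of sentiment classification demonstration mentioned in the summary.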