Evaluating AI-Based Deep Learning Models for Enhancing English Reading Comprehension Skills in Students
Main Authors:
Format: Conference Proceeding
Language: English
Subjects:
Online Access: Request full text
Summary: In the rapidly evolving field of natural language processing (NLP), improving how models understand and generate human language has become increasingly critical. As demand for better language models grows, addressing the shortcomings of current systems becomes crucial. Existing models struggle with a fundamental trade-off between the level of linguistic detail they can capture and the computational resources they require for accurate estimation in practical settings. They also exhibit several problems, including overfitting, mediocre performance on out-of-sample data, and limited sensitivity to subtle linguistic features. These obstacles restrict their usefulness in real-world conditions, where a high degree of accuracy and precision is paramount. To overcome these challenges, this study introduces a novel framework that employs the BERT model, which captures bidirectional context and deeper semantic relations between words. The proposed approach is designed to improve effectiveness considerably by exploiting the attention mechanisms and transformer architecture underlying BERT. The proposed BERT model demonstrates strong results, achieving an accuracy of 99.10% and a precision of 98.30%. These metrics reflect the model's ability to understand and generate accurate language representations, surpassing existing models in both effectiveness and efficiency. This advancement underscores the potential of BERT to address critical challenges in NLP, offering a promising solution for applications requiring high precision and robust language comprehension.
ISSN: 2768-0673
DOI: 10.1109/I-SMAC61858.2024.10714755
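The abstract describes fine-tuning BERT, with its attention mechanisms and transformer architecture, for a high-accuracy language-understanding task, but the record gives no implementation details. The following is a minimal illustrative sketch of how such a setup might look, assuming a binary classifier that scores student answers to reading-comprehension questions with the Hugging Face transformers library; the checkpoint name, dataset fields, and hyperparameters are assumptions, not details taken from the paper.

```python
# Hypothetical sketch: fine-tuning a pre-trained BERT encoder as a binary
# classifier that scores student answers to reading-comprehension questions
# as correct/incorrect. Checkpoint, data layout, and hyperparameters are
# illustrative assumptions, not the paper's actual configuration.
import torch
from torch.utils.data import DataLoader, Dataset
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "bert-base-uncased"  # assumed checkpoint; the paper only says "BERT"

class ComprehensionDataset(Dataset):
    """Pairs each (passage + question, student answer) with a 0/1 label."""
    def __init__(self, examples, tokenizer, max_len=256):
        self.examples = examples
        self.tokenizer = tokenizer
        self.max_len = max_len

    def __len__(self):
        return len(self.examples)

    def __getitem__(self, idx):
        context, answer, label = self.examples[idx]
        # Encode the passage+question and the answer as a sentence pair,
        # so BERT's attention can relate the answer to the context.
        enc = self.tokenizer(
            context, answer,
            truncation=True, padding="max_length",
            max_length=self.max_len, return_tensors="pt",
        )
        item = {k: v.squeeze(0) for k, v in enc.items()}
        item["labels"] = torch.tensor(label)
        return item

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

# Toy training data: (passage + question, student answer, label)
train_examples = [
    ("The fox ran into the forest. Where did the fox go?", "Into the forest.", 1),
    ("The fox ran into the forest. Where did the fox go?", "Onto the roof.", 0),
]
loader = DataLoader(ComprehensionDataset(train_examples, tokenizer), batch_size=2)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for epoch in range(3):
    for batch in loader:
        optimizer.zero_grad()
        out = model(**batch)   # BERT encoder + classification head
        out.loss.backward()    # cross-entropy over the two labels
        optimizer.step()
```

In a real evaluation, accuracy and precision such as the figures reported in the abstract would be computed on a held-out test split rather than on the training examples shown here.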