Dual syntax aware graph attention networks with prompt for aspect-based sentiment analysis
Published in: Scientific Reports, 2024-10, Vol. 14(1), Article 23528 (12 pages)
Format: Article
Language: English
Summary: Aspect-based sentiment analysis (ABSA) is a challenging task due to the presence of multiple aspect words with different sentiment polarities in a sentence. Recently, pre-trained language models like BERT have been widely used as context encoders in ABSA. Graph neural networks have also been employed to extract syntactic and semantic information from sentence parsing trees, resulting in superior results. However, dependency trees may establish irrelevant dependencies for sentences with irregular syntax and complex structures. Additionally, previous methods have not fully utilized recent developments in pre-trained language models. Therefore, we propose a Dual Syntax aware Graph attention networks with Prompt (DSGP) model to address these issues. Our model utilizes prompt templates to maximize the potential of pre-trained models and masked vector outputs of templates as supplementary aspect feature representations. We also leverage both dependency trees and constituent trees with graph attention networks to extract different types of syntactic information. The dependency tree captures syntactic correlation between words, while the constituent tree provides a high-level formation of the sentence. Finally, the output from the prompt and parsing trees is fused and fed into a standard classifier. Experimental results on four public datasets demonstrate the competitive performance of our model.
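The pipeline the abstract describes (graph attention over a dependency view and a constituent view, fused with a prompt [MASK] feature, then a classifier) can be sketched in miniature as below. This is only an illustrative numpy sketch of the general architecture, not the paper's implementation: the token vectors, adjacency matrices, dimensions, and weights are random stand-ins, and real systems would use BERT outputs and learned parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gat_layer(h, adj, W, a):
    """One graph-attention layer: attend only along edges of `adj`."""
    z = h @ W                                    # (n, k) projected node features
    n = len(z)
    # attention logits e_ij = LeakyReLU(a . [z_i || z_j])
    e = np.array([[a @ np.concatenate([z[i], z[j]]) for j in range(n)]
                  for i in range(n)])
    e = np.where(e > 0, e, 0.2 * e)              # LeakyReLU
    e = np.where(adj > 0, e, -1e9)               # mask out non-edges
    return np.tanh(softmax(e, axis=1) @ z)       # (n, k) attended features

# Toy sentence of 3 tokens plus one prompt [MASK] token at index 3.
n, d, k = 4, 8, 4
h = rng.standard_normal((n, d))                  # stand-in for BERT token vectors

# Two syntactic views (illustrative, with self-loops):
dep = np.array([[1, 1, 0, 1],                    # dependency-tree edges
                [1, 1, 1, 0],
                [0, 1, 1, 0],
                [1, 0, 0, 1]])
con = np.ones((n, n), dtype=int)                 # constituent co-membership (one clause)

Wd, Wc = rng.standard_normal((d, k)), rng.standard_normal((d, k))
ad, ac = rng.standard_normal(2 * k), rng.standard_normal(2 * k)

hd = gat_layer(h, dep, Wd, ad)                   # dependency-view features
hc = gat_layer(h, con, Wc, ac)                   # constituent-view features

aspect = 0                                       # index of the aspect token
Wm = rng.standard_normal((d, k))
mask_vec = h[3] @ Wm                             # prompt [MASK] output as extra feature

# Fuse the three views and classify into 3 polarities (neg/neu/pos).
fused = np.concatenate([hd[aspect], hc[aspect], mask_vec])   # (3k,)
Wo = rng.standard_normal((3 * k, 3))
probs = softmax(fused @ Wo)
print(probs.shape)
```

The fusion here is plain concatenation followed by one linear layer; the paper's actual fusion and classifier layers may differ, but the data flow (two syntax-aware GAT views plus a prompt-derived aspect feature feeding one classifier) matches the description in the abstract.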
ISSN: 2045-2322
DOI: 10.1038/s41598-024-74668-y