Tree-Structured Neural Networks With Topic Attention for Social Emotion Classification
Published in: IEEE Access, 2019, Vol. 7, pp. 95505-95515
Main Authors:
Format: Article
Language: English
Summary: Social emotion classification studies the emotion distribution evoked by an article among numerous readers. Although recent neural network-based methods improve classification performance over earlier word-emotion and topic-emotion approaches, they have not fully exploited important sentence-level language features and document topic features. In this paper, we propose a new neural network architecture that exploits both the syntactic information of a sentence and the topic distribution of a document. The proposed architecture first constructs a tree-structured long short-term memory (Tree-LSTM) network based on the sentence's syntactic dependency tree to obtain a sentence vector representation. For a multi-sentence document, we then use a Chain-LSTM network to obtain the document representation from its sentences' hidden states. Furthermore, we design a topic-based attention mechanism with two attention levels: word-level attention weights the words of a single-sentence document, and sentence-level attention weights the sentences of a multi-sentence document. Experiments on three public datasets show that the proposed scheme outperforms state-of-the-art methods in terms of average Pearson correlation coefficient and MicroF1.
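The abstract above describes the architecture only at a high level. The sketch below is one plausible PyTorch rendering of the described components, not the authors' released code: the class names (ChildSumTreeLSTM, TopicAttention, DocumentModel), all dimensions, the use of the child-sum Tree-LSTM variant, and mean-pooling of word states into sentence vectors are assumptions made for illustration.

```python
# Minimal sketch of the described pipeline: Tree-LSTM sentence encoding over a
# dependency tree, a Chain-LSTM over sentence vectors for multi-sentence documents,
# and topic-based attention at the word and sentence levels. All names and design
# details not stated in the abstract are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChildSumTreeLSTM(nn.Module):
    """Child-sum Tree-LSTM over a dependency tree (assumed Tai et al., 2015 variant)."""

    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.hid_dim = hid_dim
        self.iou_x = nn.Linear(in_dim, 3 * hid_dim)
        self.iou_h = nn.Linear(hid_dim, 3 * hid_dim, bias=False)
        self.f_x = nn.Linear(in_dim, hid_dim)
        self.f_h = nn.Linear(hid_dim, hid_dim, bias=False)

    def node_forward(self, x, child_h, child_c):
        # x: (in_dim,); child_h, child_c: (num_children, hid_dim)
        h_sum = child_h.sum(dim=0)
        i, o, u = torch.chunk(self.iou_x(x) + self.iou_h(h_sum), 3)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        f = torch.sigmoid(self.f_x(x).unsqueeze(0) + self.f_h(child_h))
        c = i * u + (f * child_c).sum(dim=0)
        return o * torch.tanh(c), c

    def forward(self, embeddings, children):
        # embeddings: (num_words, in_dim); children[i] lists the dependents of word i.
        h, c = [None] * len(children), [None] * len(children)

        def recurse(i):
            for j in children[i]:
                recurse(j)
            if children[i]:
                ch = torch.stack([h[j] for j in children[i]])
                cc = torch.stack([c[j] for j in children[i]])
            else:  # leaf node: one zero-valued virtual child
                ch = cc = torch.zeros(1, self.hid_dim)
            h[i], c[i] = self.node_forward(embeddings[i], ch, cc)

        roots = set(range(len(children))) - {j for ch in children for j in ch}
        for r in roots:
            recurse(r)
        return torch.stack(h)  # (num_words, hid_dim) hidden state per word


class TopicAttention(nn.Module):
    """Attention whose query is derived from a document topic distribution (e.g. LDA)."""

    def __init__(self, topic_dim, hid_dim):
        super().__init__()
        self.topic_proj = nn.Linear(topic_dim, hid_dim)
        self.key_proj = nn.Linear(hid_dim, hid_dim)

    def forward(self, states, topic_dist):
        # states: (n, hid_dim) word or sentence hidden states; topic_dist: (topic_dim,)
        query = torch.tanh(self.topic_proj(topic_dist))      # (hid_dim,)
        scores = torch.tanh(self.key_proj(states)) @ query   # (n,)
        weights = F.softmax(scores, dim=0)
        return weights @ states                               # weighted sum, (hid_dim,)


class DocumentModel(nn.Module):
    """Tree-LSTM sentence encoder + Chain-LSTM document encoder with topic attention."""

    def __init__(self, in_dim, hid_dim, topic_dim, num_emotions):
        super().__init__()
        self.sent_encoder = ChildSumTreeLSTM(in_dim, hid_dim)
        self.word_attn = TopicAttention(topic_dim, hid_dim)   # single-sentence documents
        self.chain_lstm = nn.LSTM(hid_dim, hid_dim, batch_first=True)
        self.sent_attn = TopicAttention(topic_dim, hid_dim)   # multi-sentence documents
        self.classifier = nn.Linear(hid_dim, num_emotions)

    def forward(self, sentences, topic_dist):
        # sentences: list of (embeddings, children) pairs, one per sentence.
        word_states = [self.sent_encoder(emb, ch) for emb, ch in sentences]
        if len(sentences) == 1:
            # Word-level topic attention weights the words of the single sentence.
            doc_vec = self.word_attn(word_states[0], topic_dist)
        else:
            # Pool each sentence's word states (mean pooling assumed here), run the
            # Chain-LSTM, then weight its hidden states with sentence-level attention.
            sent_vecs = torch.stack([s.mean(dim=0) for s in word_states])
            hs, _ = self.chain_lstm(sent_vecs.unsqueeze(0))
            doc_vec = self.sent_attn(hs.squeeze(0), topic_dist)
        return F.softmax(self.classifier(doc_vec), dim=-1)    # predicted emotion distribution
```

In this reading, the predicted emotion distribution would be trained against the readers' vote proportions, for example with a KL-divergence loss; the record does not specify the loss or how sentence vectors are pooled, so those details are assumptions.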
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2929204