
Attention-Based Sentiment Region Importance and Relationship Analysis for Image Sentiment Recognition

Bibliographic Details
Published in: Computational Intelligence and Neuroscience, 2022-11, Vol. 2022, p. 1-14
Main Authors: Yang, Shanliang, Xing, Linlin, Chang, Zheng, Li, Yongming
Format: Article
Language: English
Description
Summary: Image sentiment recognition has attracted considerable attention from academia and industry due to the increasing tendency to express opinions through images and videos online. Previous studies have focused on multilevel representations from global and local views to improve recognition performance. However, the importance of individual visual regions and the relationships among them remain insufficiently explored for image sentiment recognition. This paper proposes an attention-based sentiment region importance and relationship (ASRIR) analysis method, comprising importance attention and relation attention, for image sentiment recognition. First, we extract spatial region features from the image using a multilevel pyramid network. Second, we design importance attention to identify regions related to sentiment semantics and relation attention to model the relationships between regions. To relieve excessive concentration of attention on a few regions, we employ a unimodal function as a regularization term in the objective function. Finally, the region features weighted by the attention mechanisms are fused and fed into a fully connected layer for classification. Extensive experiments on commonly used image sentiment datasets demonstrate that the proposed method outperforms state-of-the-art approaches.
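
The abstract describes the pipeline only at a high level. The sketch below is one possible PyTorch rendering of the two attention branches it names, not the authors' implementation: the layer sizes, the scaled dot-product form of the relation attention, the fusion by concatenation, and the class count are all assumptions, and the unimodal regularization term is omitted.

    # Hypothetical sketch of an ASRIR-style head over pre-extracted region features.
    # Shapes, layer sizes, and the fusion scheme are assumptions, not the paper's exact design.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ASRIRHead(nn.Module):
        """Importance attention + relation attention over region features."""

        def __init__(self, feat_dim: int = 512, num_classes: int = 2):
            super().__init__()
            # Importance attention: one scalar weight per region.
            self.importance = nn.Sequential(
                nn.Linear(feat_dim, feat_dim // 4),
                nn.Tanh(),
                nn.Linear(feat_dim // 4, 1),
            )
            # Relation attention: scaled dot-product attention among regions.
            self.query = nn.Linear(feat_dim, feat_dim)
            self.key = nn.Linear(feat_dim, feat_dim)
            self.value = nn.Linear(feat_dim, feat_dim)
            # Classifier over the fused (importance + relation) representation.
            self.classifier = nn.Linear(2 * feat_dim, num_classes)

        def forward(self, regions: torch.Tensor) -> torch.Tensor:
            # regions: (batch, num_regions, feat_dim), e.g. pooled pyramid features.
            imp_weights = F.softmax(self.importance(regions), dim=1)   # (B, N, 1)
            imp_feat = (imp_weights * regions).sum(dim=1)              # (B, D)

            q, k, v = self.query(regions), self.key(regions), self.value(regions)
            scores = q @ k.transpose(1, 2) / regions.size(-1) ** 0.5   # (B, N, N)
            rel_feat = (F.softmax(scores, dim=-1) @ v).mean(dim=1)     # (B, D)

            fused = torch.cat([imp_feat, rel_feat], dim=-1)            # (B, 2D)
            return self.classifier(fused)

    # Example: 36 region features of dimension 512 for a batch of 8 images.
    logits = ASRIRHead()(torch.randn(8, 36, 512))
    print(logits.shape)  # torch.Size([8, 2])
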
ISSN: 1687-5265, 1687-5273
DOI: 10.1155/2022/9772714