Human Color Perception: The Impact of Color Perception on Fine-Grained Emotion Prediction in Movie and Television Videos

Bibliographic Details
Published in: Sensors (Basel, Switzerland), 2024-12, Vol. 24 (23), p. 7770
Main Authors: Wang, Shuang, Liu, Jinyu, Feng, Yiming, Ren, Xiaomeng, Hu, Huibo, Zhang, Mengfei, Su, Zhibin
Format: Article
Language:English
Summary: This article investigates the impact of visual color perception on fine-grained emotion prediction in videos, analyzing the contribution of color-perception features to that task. Twenty subjects took part in the experiment. First, 10 subjects performed a fine-grained subjective emotion evaluation on 50 video clips; then another 10 subjects performed a subjective color-perception annotation experiment on the same 50 clips. On this basis, the correlation and mechanism between color-perception features and fine-grained emotions were analyzed. Finally, a fine-grained emotion prediction model was built by combining objective features with color-perception features. Compared with using objective features alone, incorporating the perceptual features improved the model's accuracy. The article also compares the importance of different perceptual features for each emotion and explains the mechanism linking color perception to fine-grained emotions. Selecting the top 24 most important features for prediction captured the association between perceptual features and emotions more effectively.
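The abstract's final step, ranking feature importance and predicting emotion scores from only the top 24 features, can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline: the data here are random stand-ins, the importance measure (absolute Pearson correlation) and the least-squares predictor are assumptions, since the record does not specify the model used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in data: 50 clips, 40 features. The paper's real inputs are
# objective video features plus color-perception annotations; these random
# values are purely illustrative.
n_clips, n_features, k = 50, 40, 24
X = rng.normal(size=(n_clips, n_features))
# Synthetic "emotion score" driven mainly by features 0 and 1.
y = 0.8 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=n_clips)

# Rank features by absolute correlation with the emotion score
# (a simple importance proxy; the paper's ranking method is not given here).
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_features)])
top_k = np.argsort(corr)[::-1][:k]

# Fit a least-squares predictor on the top-k features only.
Xk = np.column_stack([X[:, top_k], np.ones(n_clips)])
coef, *_ = np.linalg.lstsq(Xk, y, rcond=None)
pred = Xk @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print("top-k features include the true drivers:", {0, 1} <= set(top_k.tolist()))
print("R^2 on the selected features:", round(r2, 3))
```

In the paper's setting the candidate pool would mix objective and perceptual features, so the same top-k ranking doubles as the per-emotion importance comparison the abstract describes.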
ISSN: 1424-8220
DOI:10.3390/s24237770