
MEAS: Multimodal Emotion Analysis System for Short Videos on Social Media Platforms

Bibliographic Details
Published in: IEEE Transactions on Computational Social Systems, 2024-12, p. 1-13
Main Authors: Wei, Qinglan, Zhou, Yaqi, Xiang, Shenlian, Xiao, Longhui, Zhang, Yuan
Format: Article
Language:English
Summary: Short videos have surged in popularity on social media. The emotions they express can trigger or even magnify public sentiment, so accurate computation of these emotions is vital for social affective computing. However, multimodal emotion analysis of short videos on social platforms faces several challenges: model accuracy is strained by inconsistent video resolutions, while collecting large-scale social media data, manually transcribing and segmenting audio content, and precisely labeling it are highly labor-intensive. In this article, we propose MEAS, an affective computing system for social short videos that combines multiscale resolution adaptability with the advanced RoBERTa model to optimize the preprocessing of high-definition, large-size short videos and to increase the contribution of the text modality to emotion analysis. The system also adopts automatic audio segmentation and transcription to efficiently capture speech in social short videos. Experimental results show that, compared with the leading open-source algorithm V2EM on the IEMOCAP dataset, the proposed method improves weighted accuracy and F1 score by 4.17% and 7.29%, respectively. We constructed a novel dataset named "Bili-news" from news short videos on a social platform, validating the effectiveness of the MEAS system. Through experimental verification, we also find a significant positive correlation between the emotions expressed in short videos and the social sentiments of the audience.
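The "multiscale resolution adaptability" mentioned in the summary can be pictured as normalizing frames of differing resolutions to a fixed set of target sizes before they reach the model. The sketch below is purely illustrative (the function names, target scales, and nearest-neighbor sampling are assumptions, not details from the paper):

```python
def resize_frame(frame, out_h, out_w):
    """Nearest-neighbor resize of a 2-D frame given as a list of rows."""
    in_h, in_w = len(frame), len(frame[0])
    return [
        [frame[(y * in_h) // out_h][(x * in_w) // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]

def multiscale_preprocess(frame, scales=((224, 224), (112, 112))):
    """Produce copies of one frame at several target resolutions, so frames
    from videos with disunified resolutions share a common set of sizes."""
    return {size: resize_frame(frame, *size) for size in scales}
```

A real system would operate on decoded video tensors with a proper resampling filter; this sketch only conveys the idea of mapping arbitrary input resolutions onto fixed scales.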
ISSN:2373-7476
DOI:10.1109/TCSS.2024.3490846