Multi-task CNN for multi-cue affects recognition using upper-body gestures and facial expressions

Bibliographic Details
Published in: International Journal of Information Technology (Singapore, Online), 2022-02, Vol. 14 (1), p. 531-538
Main Authors: Zaghbani, Soumaya; Bouhlel, Med Salim
Format: Article
Language:English
Description
Summary: Research in psychology and affective state recognition has demonstrated that, in most cases, emotion is conveyed as much through the body as through the face. In this line, the purpose of this work is to identify an individual's affective state from their facial expression and upper-body gesture. We aim to recognize six emotions: anger, anxiety, boredom, fear, happiness, and sadness. To this end, we propose a multi-modal classification of facial images and upper-body gestures using a multi-task convolutional neural network. The network is composed of two sub-networks: the first is dedicated to extracting facial-expression features, and the second branch handles upper-body gesture actions. Finally, the two branches are combined and two fully connected layers are added to predict the emotions. To train the network, we used a late-fusion model to combine the two branches. Results demonstrate that the presented method achieves a high accuracy of 99.75% and that using body gestures coupled with facial expressions is more effective than using either cue independently.
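
The abstract describes a two-branch multi-task CNN in which facial-expression and upper-body-gesture features are extracted separately and then fused late, before two fully connected layers produce the six-class emotion prediction. The sketch below is a minimal illustration of that late-fusion layout in PyTorch; the branch depths, layer widths, and input resolutions (48x48 face crops, 128x128 body frames) are assumptions made for illustration and are not taken from the paper.

```python
# Minimal sketch of a two-branch CNN with late fusion, assuming grayscale
# 48x48 face crops and RGB 128x128 upper-body frames. Layer sizes are
# illustrative placeholders, not the architecture reported in the article.
import torch
import torch.nn as nn

class EmotionFusionNet(nn.Module):
    def __init__(self, num_emotions: int = 6):
        super().__init__()
        # Branch 1: facial-expression features (assumed 1x48x48 input)
        self.face_branch = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),
            nn.Linear(64 * 12 * 12, 128), nn.ReLU(),
        )
        # Branch 2: upper-body-gesture features (assumed 3x128x128 input)
        self.body_branch = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),
            nn.Linear(64 * 32 * 32, 128), nn.ReLU(),
        )
        # Late fusion: concatenate branch features, then two fully connected layers
        self.classifier = nn.Sequential(
            nn.Linear(128 + 128, 64), nn.ReLU(),
            nn.Linear(64, num_emotions),
        )

    def forward(self, face: torch.Tensor, body: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.face_branch(face), self.body_branch(body)], dim=1)
        return self.classifier(fused)  # logits over the six emotion classes

if __name__ == "__main__":
    model = EmotionFusionNet()
    face = torch.randn(4, 1, 48, 48)    # batch of face crops
    body = torch.randn(4, 3, 128, 128)  # batch of upper-body frames
    print(model(face, body).shape)      # torch.Size([4, 6])
```

In a layout like this, each branch can learn its own cue independently while the fusion classifier learns from the concatenated features, which is one common way to realize the late-fusion training the abstract mentions.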
ISSN: 2511-2104
2511-2112
DOI: 10.1007/s41870-021-00820-w