Real-time emotion recognition using end-to-end attention-based fusion network

Bibliographic Details
Published in: Journal of Electronic Imaging, 2023-01, Vol. 32(1), p. 013050
Main Authors: Shit, Sahadeb; Rana, Aiswarya; Das, Dibyendu Kumar; Ray, Dip Narayan
Format: Article
Language: English
Description
Summary: Real-time emotion detection based on facial expression is an innovative research field with applications in areas such as health care, human–machine vision, and autonomous safety. Researchers in object detection are developing methods to interpret and code facial expressions, and to extract these features so that machines can predict them more reliably. Furthermore, the success of deep learning with different architectures has been exploited to achieve better performance. However, these methods fail drastically under the excessive sweating that accompanies different health conditions. We aim to create a dataset covering different health conditions and to detect facial emotion using an encoder- and decoder-based deep learning methodology. The proposed architecture and dataset are compared against other proposed methods, with quantitative and qualitative results demonstrating the progress made. The major benefit of our study is improved emotion-detection efficiency relative to other proposed methods, together with real-time applicability across different health conditions. We propose feature extraction of facial expressions with an end-to-end attention-module-based fusion network for detecting different facial emotions (happy, angry, neutral, surprised, etc.) with an accuracy of 99.68%. The proposed system depends on the human face, which reflects human brain activity and emotion.
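The abstract does not give implementation details of the attention-module-based fusion network, so the following is only a minimal NumPy sketch of the general idea of attention-based feature fusion: features from one branch attend to features from another via scaled dot-product attention, and the attended result is concatenated back onto the query features. All names, shapes, and the concatenation choice here are illustrative assumptions, not the authors' architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_fusion(feat_a, feat_b):
    """Hypothetical fusion step: feat_a (queries) attends to feat_b
    (keys/values) with scaled dot-product attention, and the attended
    features are concatenated with feat_a along the channel axis."""
    d = feat_a.shape[-1]
    scores = feat_a @ feat_b.T / np.sqrt(d)   # (Na, Nb) similarity scores
    weights = softmax(scores, axis=-1)        # attention weights over feat_b
    attended = weights @ feat_b               # (Na, d) attended features
    return np.concatenate([feat_a, attended], axis=-1)  # (Na, 2d)

rng = np.random.default_rng(0)
a = rng.standard_normal((4, 8))   # e.g. features from one network branch
b = rng.standard_normal((6, 8))   # e.g. features from another branch
fused = attention_fusion(a, b)
print(fused.shape)  # → (4, 16)
```

In a trained model the two branches would be learned encoder feature maps and the fused representation would feed a classifier over the emotion classes; here random arrays stand in only to show the tensor shapes involved.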
ISSN: 1017-9909, 1560-229X
DOI: 10.1117/1.JEI.32.1.013050