
Facial Micro-Expression Recognition Using Two-Dimensional Landmark Feature Maps

Bibliographic Details
Published in: IEEE Access, 2020, Vol. 8, pp. 121549-121563
Main Authors: Choi, Dong Yoon; Song, Byung Cheol
Format: Article
Language:English
Summary: Emotion recognition based on facial expressions is essential for effective interaction between humans and artificial intelligence (AI) systems such as social robots. In real environments, however, facial micro-expressions (FMEs) are much harder to recognize than general facial expressions, which convey richer emotional cues. In this paper, we propose a two-dimensional (2D) landmark feature map (LFM) for effectively recognizing such FMEs. The proposed LFM is obtained by transforming conventional coordinate-based landmark information into 2D image information, and it is designed to be independent of the intensity of facial expression change, an advantageous property for micro-expressions. We also propose an LFM-based emotion recognition method that integrates a convolutional neural network (CNN) with long short-term memory (LSTM). Experimental results show that the proposed method achieves recognition accuracies of about 71% and 74% on the well-known micro-expression datasets SMIC and CASME II, respectively, outperforming conventional methods. The performance of the proposed method was also verified through experiments on a composite micro-expression dataset consisting of SMIC, CASME II, and SAMM, and through cross-dataset validation between SMIC and CASME II. In addition, we show that the proposed method is independent of facial expression intensity through an experiment on the CK+ dataset. Finally, we demonstrate that the proposed method remains valid on the MAHNOB-HCI and MEVIEW datasets, which capture spontaneous, in-the-wild emotional responses.
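
To make the pipeline described in the abstract concrete, below is a minimal sketch of a landmark-map CNN-LSTM of the kind the summary outlines. It is written in PyTorch; the rasterization scheme, the 64x64 map size, the layer dimensions, and the class count are illustrative assumptions, not the authors' implementation, which is detailed only in the full paper.

# Illustrative sketch only: renders facial landmarks as a 2D feature map,
# then runs a small CNN-LSTM over a frame sequence. All sizes, layer
# choices, and the rendering scheme are assumptions, not the paper's design.
import torch
import torch.nn as nn

def landmarks_to_map(landmarks, size=64):
    """Rasterize (x, y) landmark coordinates in [0, 1] onto a size x size grid."""
    fmap = torch.zeros(1, size, size)
    for x, y in landmarks:
        col = min(int(x * (size - 1)), size - 1)
        row = min(int(y * (size - 1)), size - 1)
        fmap[0, row, col] = 1.0
    return fmap

class CnnLstmClassifier(nn.Module):
    def __init__(self, num_classes=3, hidden=128):
        super().__init__()
        self.cnn = nn.Sequential(                 # per-frame spatial features
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),                         # -> 32 * 16 * 16 features
        )
        self.lstm = nn.LSTM(32 * 16 * 16, hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, maps):                      # maps: (B, T, 1, 64, 64)
        b, t = maps.shape[:2]
        feats = self.cnn(maps.flatten(0, 1)).view(b, t, -1)
        out, _ = self.lstm(feats)                 # temporal modeling over frames
        return self.head(out[:, -1])              # classify from last time step

# Example: a batch of 2 clips, 10 frames each, 68 random landmarks per frame.
clips = torch.stack([
    torch.stack([landmarks_to_map(torch.rand(68, 2)) for _ in range(10)])
    for _ in range(2)
])
logits = CnnLstmClassifier()(clips)
print(logits.shape)  # torch.Size([2, 3])

The design idea mirrored here is that rasterizing landmark coordinates into an image lets a CNN learn spatial patterns per frame, while the LSTM models how those patterns evolve across the clip; because only landmark positions are rendered, the input does not depend on expression intensity in the raw pixels.
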
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.3006958