
Geometric-Convolutional Feature Fusion Based on Learning Propagation for Facial Expression Recognition

Bibliographic Details
Published in: IEEE Access, 2018-01, Vol. 6, pp. 42532-42540
Main Authors: Tang, Yan; Zhang, Xing Ming; Wang, Haoxiang
Format: Article
Language:English
Description
Summary: Facial expression is the primary means by which humans convey emotion, and it carries spatio-temporal information that computers can recognize. In this paper, three video-based models are proposed for a facial expression recognition system (FERS). First, a differential geometric fusion network (DGFN) is proposed, which builds on the handcrafted features used in traditional machine learning. The static geometric feature in the DGFN, derived from facial regions that psychology identifies as critical and from physiological rules, is converted into a differential geometric feature by the geometric fusion model. Then, a deep facial-sequential network (DFSN) is designed based on a multi-dimensional convolutional neural network (CNN). Finally, the DFSN-I is proposed, which combines the DGFN and the DFSN, taking advantage of both to achieve better performance. Experimental results show that combining handcrafted features, which encode prior knowledge, with automatically extracted features yields better performance. They also show that the DFSN and DFSN-I outperform state-of-the-art methods on the Oulu-CASIA data set and achieve nearly the best performance on CK+ among video-based methods.
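
As a rough illustration of the DFSN-I fusion idea described in the abstract, the following is a minimal sketch assuming a PyTorch setup. The class name FusionFER, all layer sizes, and the input shapes are hypothetical placeholders, not the authors' actual architecture; it only shows the general pattern of concatenating a handcrafted geometric feature vector (DGFN-style branch) with features auto-extracted from a frame sequence by a 3D CNN (DFSN-style branch).

    # Hypothetical sketch: fuse a handcrafted geometric feature vector
    # with spatio-temporal CNN features extracted from a video clip.
    import torch
    import torch.nn as nn

    class FusionFER(nn.Module):
        def __init__(self, geo_dim=64, num_classes=6):
            super().__init__()
            # DFSN-like branch: 3D convolutions over the grayscale clip
            self.cnn = nn.Sequential(
                nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool3d(2),
                nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool3d(1),  # -> (N, 32, 1, 1, 1)
                nn.Flatten(),             # -> (N, 32)
            )
            # DGFN-like branch: embed the differential geometric feature
            self.geo = nn.Sequential(nn.Linear(geo_dim, 32), nn.ReLU())
            # Fusion head: concatenate both feature vectors, then classify
            self.head = nn.Linear(32 + 32, num_classes)

        def forward(self, clip, geo_feat):
            # clip: (N, 1, T, H, W) frame sequence
            # geo_feat: (N, geo_dim) handcrafted geometric feature
            fused = torch.cat([self.cnn(clip), self.geo(geo_feat)], dim=1)
            return self.head(fused)

    model = FusionFER()
    logits = model(torch.randn(2, 1, 8, 64, 64), torch.randn(2, 64))
    print(logits.shape)  # torch.Size([2, 6])

The design choice this sketch mirrors is the paper's central claim: the two branches produce complementary evidence, so a late concatenation of prior-knowledge features and learned features can outperform either branch alone.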
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2018.2858278