Facial expression recognition considering individual differences in facial structure and texture


Bibliographic Details
Published in:IET computer vision 2014-10, Vol.8 (5), p.429-440
Main Authors: Yi, Jizheng, Mao, Xia, Chen, Lijiang, Xue, Yuli, Compare, Angelo
Format: Article
Language:English
Description
Summary: Facial expression recognition (FER) plays an important role in human–computer interaction. Recent years have witnessed a growing variety of approaches to FER, but these approaches usually do not consider the effect of individual differences on the recognition result. When a face changes from neutral to a given expression, the changing information, constituted of structural characteristics and texture information, provides rich clues not visible in either image alone, and is therefore believed to be of great importance for machine vision. This study proposes a novel FER algorithm that exploits the structural characteristics and the texture information hidden in the image space. Firstly, feature points are marked by an active appearance model. Secondly, three facial features, namely the feature point distance ratio coefficient, the connection angle ratio coefficient and the skin deformation energy parameter, are proposed to eliminate differences among individuals. Finally, a radial basis function neural network is utilised as the classifier for FER. Extensive experiments on the Cohn–Kanade database and the Beihang University (BHU) facial expression database show significant advantages of the proposed method over existing ones.
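The abstract does not give the exact definitions of the three features, but the idea of normalising an expressive face against the subject's own neutral face can be sketched. The following is a minimal, hypothetical illustration (not the authors' formulation): a distance-ratio feature divides an inter-landmark distance in the expressive frame by the same distance in the neutral frame, and an angle-ratio feature does the same for an angle formed at a landmark, so that per-subject facial geometry cancels out. Landmark indices and the choice of landmark triples are assumptions for illustration only.

```python
import numpy as np

def distance_ratio_coefficient(neutral, expressive, i, j):
    """Hypothetical distance-ratio feature: the distance between landmarks
    i and j in the expressive face, divided by the same distance in the
    subject's neutral face. Normalising by the subject's own neutral
    geometry is one plausible way to suppress individual differences."""
    d_neutral = np.linalg.norm(neutral[i] - neutral[j])
    d_expressive = np.linalg.norm(expressive[i] - expressive[j])
    return d_expressive / d_neutral

def connection_angle_ratio(neutral, expressive, i, j, k):
    """Hypothetical angle-ratio feature: the angle at landmark j formed by
    landmarks i and k, expressive over neutral."""
    def angle(pts):
        v1 = pts[i] - pts[j]
        v2 = pts[k] - pts[j]
        cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
        return np.arccos(np.clip(cos, -1.0, 1.0))
    return angle(expressive) / angle(neutral)

# Toy landmarks (x, y): indices 0 and 1 are mouth corners, 2 is the nose tip.
neutral = np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 1.5]])
smiling = np.array([[-0.5, 0.0], [2.5, 0.0], [1.0, 1.5]])  # mouth widens

# A smile widens the mouth, so the ratio exceeds 1 regardless of how
# wide this particular subject's mouth is at rest.
print(distance_ratio_coefficient(neutral, smiling, 0, 1))  # → 1.5
```

Because each feature is a ratio against the same subject's neutral frame, two people with very different resting mouth widths produce comparable values for the same expression, which is the individual-difference elimination the abstract describes.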
ISSN:1751-9632
1751-9640
DOI:10.1049/iet-cvi.2013.0171