
An efficient automatic facial expression recognition using local neighborhood feature fusion

Bibliographic Details
Published in: Multimedia Tools and Applications, 2021-03, Vol. 80 (7), p. 10187-10212
Main Authors: Shanthi, P., Nickolas, S.
Format: Article
Language:English
Description
Summary: In computer vision, several feature extraction methods have been developed to differentiate variations of facial expressions. However, existing texture-encoding-based methods do not consider the effect of the relationship among neighboring pixels. This paper exploits a method to analyze the association among adjacent pixels using a feature fusion technique. For efficient texture representation, the proposed approach combines the Local Binary Pattern (LBP) with the Local Neighborhood Encoded Pattern (LNEP). The LBP feature encodes the relationship of adjacent pixels with respect to the central pixel, whereas LNEP represents the relationship among the two closest local neighboring pixels of the current pixel. After concatenating LBP with LNEP, the most relevant features are selected using chi-square statistical analysis and classified with a multiclass Support Vector Machine (SVM). Experimental findings show that the proposed hybrid feature outperforms the individual features, achieving average recognition accuracies of 97.86% and 97.11% on the CK+ and MMI datasets, respectively. The effectiveness of the reduced hybrid feature is also evaluated under a noisy environment, where it likewise shows better performance.
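The pipeline outlined in the abstract (LBP and a neighbor-pair descriptor fused by concatenation, chi-square feature selection, multiclass SVM) can be sketched roughly as below. This is a minimal illustration assuming scikit-image and scikit-learn; the lnep_like_histogram function is a hypothetical, simplified neighbor-pair encoding used only as a stand-in, since the paper's exact LNEP definition is not given in this record.

```python
# Minimal sketch of an LBP + neighbor-pair feature fusion pipeline
# (illustrative only; not the authors' exact implementation).
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

def lbp_histogram(image, p=8, r=1):
    """Uniform LBP codes pooled into a normalized histogram."""
    codes = local_binary_pattern(image, p, r, method="uniform")
    hist, _ = np.histogram(codes, bins=p + 2, range=(0, p + 2), density=True)
    return hist

def lnep_like_histogram(image):
    """Hypothetical stand-in for LNEP: compare pairs of nearest neighbors
    of each pixel with one another and pool the 4-bit codes into a
    histogram (borders wrap around for simplicity)."""
    img = image.astype(np.int32)
    up, down = np.roll(img, 1, axis=0), np.roll(img, -1, axis=0)
    left, right = np.roll(img, 1, axis=1), np.roll(img, -1, axis=1)
    bits = [(up >= down), (left >= right), (up >= left), (down >= right)]
    codes = np.zeros_like(img)
    for i, b in enumerate(bits):
        codes |= b.astype(np.int32) << i
    hist, _ = np.histogram(codes, bins=16, range=(0, 16), density=True)
    return hist

def hybrid_feature(image):
    # Concatenate the two descriptors, as the abstract describes.
    return np.concatenate([lbp_histogram(image), lnep_like_histogram(image)])

def build_classifier(k_best=20):
    # Chi-square feature selection followed by a multiclass SVM.
    return make_pipeline(SelectKBest(chi2, k=k_best), SVC(kernel="rbf"))

# Usage (face_images: grayscale face crops, y: expression labels):
# X = np.stack([hybrid_feature(face) for face in face_images])
# clf = build_classifier().fit(X, y)
```

Chi-square selection is applied to the concatenated histograms because they are non-negative, which is a requirement of the chi-square test; the SVM handles the multiclass case directly.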
ISSN: 1380-7501, 1573-7721
DOI: 10.1007/s11042-020-10105-2