Convolutional neural networks for posed and spontaneous expression recognition
Format: Conference Proceeding
Language: English
Summary: Differentiating posed expressions from spontaneous ones is a more challenging task than conventional facial expression recognition. Many methods have been proposed to differentiate posed and spontaneous expressions based on pixel-level information, but these methods still have some limitations: (1) Most studies use the difference between onset (the early stage of an expression) and apex (the most intense stage of an expression) raw images at the pixel level as inputs, which may not only contain noisy information but also lose some useful information. (2) Much previous work uses hand-crafted features designed by rules, which have limited capability for abstraction and representation. Considering that high-level image representations usually contain less noise, we propose a special layer named the "comparison layer" for convolutional neural networks (CNNs) to measure the difference between onset and apex images at the level of high-level representations (instead of pixel-level differences). We add the comparison layer to a group of CNNs and combine the learned representations from those CNNs to form the input of a classifier for differentiating posed and spontaneous expressions. Experiments on the USTC-NVIE database (so far the largest database for this task) show that our method significantly outperforms the state-of-the-art methods, improving accuracy from 91.73% to 97.98%.
ISSN: 1945-788X
DOI: 10.1109/ICME.2017.8019373
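
The summary above describes a "comparison layer" that differences high-level CNN representations of onset and apex frames rather than raw pixels. Below is a minimal PyTorch sketch of that idea; the backbone, layer sizes, and single-branch setup are illustrative assumptions, not the paper's exact architecture (the paper combines representations from a group of CNNs rather than a single network).

```python
# A minimal sketch of a feature-level "comparison layer", assuming a
# shared-weight backbone over onset and apex frames. All layer shapes
# are hypothetical; they do not reproduce the paper's architecture.
import torch
import torch.nn as nn

class ComparisonCNN(nn.Module):
    """One CNN branch with shared weights for onset and apex frames;
    the comparison layer subtracts their high-level feature maps."""

    def __init__(self, num_classes: int = 2):
        super().__init__()
        # Hypothetical shared convolutional backbone.
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        # Classifier over the differenced representation.
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 4 * 4, 128), nn.ReLU(),
            nn.Linear(128, num_classes),  # posed vs. spontaneous
        )

    def forward(self, onset: torch.Tensor, apex: torch.Tensor) -> torch.Tensor:
        # High-level representations of both frames (shared weights).
        f_onset = self.backbone(onset)
        f_apex = self.backbone(apex)
        # "Comparison layer": difference at feature level, not pixel level.
        diff = f_apex - f_onset
        return self.classifier(diff)

# Usage with dummy grayscale 64x64 onset/apex frame pairs.
model = ComparisonCNN()
onset = torch.randn(8, 1, 64, 64)
apex = torch.randn(8, 1, 64, 64)
logits = model(onset, apex)  # shape: (8, 2)
```

Differencing after the convolutional stages, as sketched here, is what lets the comparison operate on abstracted representations, which the abstract argues are less noisy than raw pixel differences.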