A Multiview Representation Framework for Micro-Expression Recognition

Bibliographic Details
Published in: IEEE Access, 2019, Vol. 7, p. 120670-120680
Main Authors: Huang, Tianhuan, Chen, Lei, Feng, Yuncong, Ben, Xianye, Xiao, Ruixue, Xue, Tianle
Format: Article
Language:English
Description
Summary: Multiview representation has become important due to its strong performance on machine learning problems. In this paper, a multiview representation framework based on transfer learning is proposed for micro-expression recognition. The framework takes macro-expression as the auxiliary domain and micro-expression as the target domain, and assists the identification of micro-expressions by transferring the rich information extracted from the auxiliary domain, which effectively addresses the small-sample problem of micro-expression recognition. The proposed algorithm consists of three parts. First, the features of the two domains are projected into a common space, and a dictionary is learned for each domain. Then the dictionary of the micro-expression domain is linearly reconstructed. Finally, to make fuller use of the feature information, the most representative features from four different micro-expression feature sets are selected by multiview representation. Experiments and evaluation are carried out on three different databases, and the proposed algorithm is compared with other advanced methods. The experimental results show that the proposed algorithm outperforms the related methods.
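The "linearly reconstructed" step in the summary can be sketched as a least-squares problem: expressing each atom of the target-domain (micro-expression) dictionary as a linear combination of auxiliary-domain (macro-expression) atoms. The dimensions, random dictionaries, and plain least-squares solver below are illustrative assumptions, not the authors' exact formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: d-dimensional features in the common space, k atoms per dictionary.
d, k = 64, 16

# Stand-ins for dictionaries learned on each domain (the paper learns these;
# here they are random placeholders for illustration).
D_macro = rng.standard_normal((d, k))  # auxiliary (macro-expression) dictionary
D_micro = rng.standard_normal((d, k))  # target (micro-expression) dictionary

# Linear reconstruction: find coefficients W minimizing ||D_macro @ W - D_micro||_F,
# i.e. rebuild each micro-expression atom from the macro-expression atoms.
W, *_ = np.linalg.lstsq(D_macro, D_micro, rcond=None)

D_micro_rec = D_macro @ W
err = np.linalg.norm(D_micro - D_micro_rec) / np.linalg.norm(D_micro)
print(f"relative reconstruction error: {err:.3f}")
```

In the actual framework the reconstruction would typically carry a sparsity or regularization term rather than an unconstrained least-squares fit; this sketch only shows the shape of the transfer step.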
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2932784