Human action recognition with skeleton induced discriminative approximate rigid part model
Published in: Pattern Recognition Letters, 2016-11, Vol. 83, pp. 261-267
Main Authors:
Format: Article
Language: English
Summary:
• We present a discriminative approximate rigid part model that is induced by the skeleton.
• Both the surface cue and the skeleton cue are fused together.
• We present a new definition for the part model.
• A novel sparsity-induced feature selection approach is introduced.
• Good experimental results are reported on two widely used action recognition datasets.
Human action recognition has a long research history. Although various approaches have been designed over the last decades, it remains challenging in computer vision and pattern recognition. In this paper, we present a skeleton-induced discriminative approximate rigid part model for human action recognition, which not only captures the geometrical structure of the human body, but also takes rich human body surface cues into consideration. In conventional approaches, the joint feature and the surface feature are discussed separately. In contrast, our proposed approach embeds the structural information into the surface model. In addition, to separate different approximate rigid parts generated from different human activities, a novel sparsity-induced feature selection scheme is introduced. This scheme produces a discriminative feature subspace that can best separate different action classes. The presented approach is validated on two widely adopted benchmark datasets, i.e. the MSR Daily Activity 3D dataset and the MSR Action 3D dataset. Experimental results demonstrate the effectiveness of our approach.
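The record does not give the paper's actual formulation of the sparsity-induced feature selection scheme, so the following is only a minimal illustrative sketch of the general idea: using an L1-regularized classifier to zero out most feature weights and keep a discriminative subset. All data shapes, parameter values, and the use of scikit-learn are assumptions for illustration, not the authors' method.

```python
# Illustrative sketch only: generic L1 (sparsity-inducing) feature selection,
# standing in for the paper's unspecified selection scheme.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.feature_selection import SelectFromModel

rng = np.random.default_rng(0)
# Hypothetical part-level descriptors: 200 samples, 500-dim fused surface/skeleton
# features, 4 action classes (sizes are made up for the example).
X = rng.normal(size=(200, 500))
y = rng.integers(0, 4, size=200)

# The L1 penalty drives most feature weights to zero; SelectFromModel keeps the
# features whose weights survive, i.e. a compact discriminative subspace.
selector = SelectFromModel(
    LogisticRegression(penalty="l1", solver="liblinear", C=0.1, max_iter=1000)
).fit(X, y)

X_reduced = selector.transform(X)  # reduced, more discriminative feature subspace
print(X_reduced.shape)
```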
ISSN: 0167-8655, 1872-7344
DOI: 10.1016/j.patrec.2016.07.025