Representative‐discriminative dictionary learning algorithm for human action recognition using smartphone sensors
Published in: Concurrency and Computation: Practice and Experience, 2023-01, Vol. 35 (2), p. n/a
Main Authors:
Format: Article
Language: English
Subjects:
Summary: With the advancement of mobile computing, the understanding and interpretation of human activities have become increasingly popular as an innovative human-computer interaction application over the past few decades. This article presents a new scheme for action recognition based on sparse representation theory using a novel dictionary learning algorithm. The system employs two types of inertial signals from smartphones, namely accelerometer and gyroscope sensory data. Higher classification accuracy depends on creating effective dictionaries that fully retain the important features of every action while keeping the correlation with the features of other actions as low as possible. Accordingly, this research proposes a new dictionary learning algorithm with two levels of dictionary training that aims to learn a compact, representative, and discriminative dictionary for each class. Unlike typical dictionary learning algorithms, which aim only to create dictionaries that best represent the features of each class, the proposed algorithm incorporates a discriminative criterion that ultimately yields better classification results. To validate the proposed framework, all experiments were performed on three publicly available datasets.
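The abstract describes a sparse-representation pipeline built on one learned dictionary per action class. As an illustration only, the sketch below shows the standard residual-based variant of that idea using scikit-learn: each class dictionary is learned with plain reconstructive dictionary learning (not the paper's two-level representative-discriminative training), and a test feature vector is assigned to the class whose dictionary reconstructs it with the smallest error. All function names, data shapes, and hyperparameters here are placeholder assumptions, not taken from the article.

```python
# Illustrative sketch (not the authors' algorithm): per-class sparse
# dictionaries plus minimum-residual classification, in scikit-learn.
import numpy as np
from sklearn.decomposition import DictionaryLearning

def train_dictionaries(features_by_class, n_atoms=32, sparsity=5):
    """Learn one reconstructive dictionary per action class.

    features_by_class: dict mapping class label -> array of shape
    (n_samples, n_features), e.g. statistical features extracted from
    accelerometer/gyroscope windows (feature extraction is assumed).
    """
    dicts = {}
    for label, X in features_by_class.items():
        dl = DictionaryLearning(
            n_components=n_atoms,              # dictionary size (assumed)
            transform_algorithm="omp",         # sparse coding via OMP
            transform_n_nonzero_coefs=sparsity,
            random_state=0,
        )
        dl.fit(X)
        dicts[label] = dl
    return dicts

def classify(x, dicts):
    """Assign x to the class whose dictionary reconstructs it best."""
    best_label, best_err = None, np.inf
    for label, dl in dicts.items():
        code = dl.transform(x.reshape(1, -1))   # sparse code over this class
        recon = code @ dl.components_           # reconstruction from atoms
        err = np.linalg.norm(x - recon.ravel()) # residual for this class
        if err < best_err:
            best_label, best_err = label, err
    return best_label
```

The paper's contribution, per the abstract, is to add a discriminative criterion during dictionary training so that each class dictionary also stays weakly correlated with other classes' features; the residual-based decision rule above would remain the same.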
ISSN: 1532-0626, 1532-0634
DOI: 10.1002/cpe.7468