A Unified Local-Global Feature Extraction Network for Human Gait Recognition Using Smartphone Sensors
Published in: Sensors (Basel, Switzerland), 2022-05, Vol. 22 (11), p. 3968
Main Authors: , ,
Format: Article
Language: English
Summary: Smartphone-based gait recognition is considered a unique and promising technique for biometric identification. A smartphone integrates multiple sensors that collect inertial data while a person walks. However, the captured data may be affected by several covariate factors arising from variations in gait sequences, such as carried loads, clothing types, and shoe types. Recent gait recognition approaches operate on either global or local features alone and therefore fail to handle these covariate-dependent features. To address this issue, a novel weighted multi-scale CNN (WMsCNN) architecture is designed to extract local-to-global features and boost recognition accuracy. Specifically, a weight-update sub-network (Ws) is proposed to increase or decrease the weight of each feature according to its contribution to the final classification task, so that the sensitivity of these features to the covariate factors decreases. The re-weighted features are then fed to a fusion module that produces global features for the overall classification. Extensive experiments were conducted on four benchmark datasets, and the results of the proposed model are superior to those of other state-of-the-art deep learning approaches.
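The record gives only a high-level description of the architecture, so the following is a minimal PyTorch sketch of how a weighted multi-scale CNN with a weight-update sub-network and a fusion module could be structured. The class names (`WeightUpdateSubNetwork`, `WMsCNN`), channel widths, kernel sizes, and the squeeze-and-excitation-style gating used for the weight update are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: layer shapes and the gating mechanism are assumptions,
# since the record does not include the paper's implementation details.
import torch
import torch.nn as nn

class WeightUpdateSubNetwork(nn.Module):
    """Learns per-channel feature weights, loosely analogous to the Ws sub-network."""
    def __init__(self, channels):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool1d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // 4),
            nn.ReLU(inplace=True),
            nn.Linear(channels // 4, channels),
            nn.Sigmoid(),  # weights in (0, 1) scale each feature channel up or down
        )

    def forward(self, x):                      # x: (batch, channels, time)
        w = self.fc(self.pool(x).squeeze(-1))  # (batch, channels)
        return x * w.unsqueeze(-1)             # re-weighted features

class WMsCNN(nn.Module):
    """Multi-scale 1-D CNN over inertial sequences with weighted feature fusion."""
    def __init__(self, in_channels=6, num_classes=20, width=32):
        super().__init__()
        # Parallel branches with different kernel sizes capture local (small kernel)
        # to more global (large kernel) temporal patterns in the gait signal.
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv1d(in_channels, width, kernel_size=k, padding=k // 2),
                nn.BatchNorm1d(width),
                nn.ReLU(inplace=True),
                WeightUpdateSubNetwork(width),
            )
            for k in (3, 7, 15)
        ])
        self.fusion = nn.Sequential(  # fuse branch outputs into global features
            nn.Conv1d(width * 3, width * 2, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(width * 2, num_classes)

    def forward(self, x):  # x: (batch, sensor axes, time steps)
        feats = torch.cat([branch(x) for branch in self.branches], dim=1)
        return self.classifier(self.fusion(feats).squeeze(-1))

# Usage: a batch of 4 windows of 6-axis accelerometer + gyroscope data, 128 samples each
model = WMsCNN(in_channels=6, num_classes=20)
logits = model(torch.randn(4, 6, 128))
print(logits.shape)  # torch.Size([4, 20])
```

The sigmoid gate plays the role the summary ascribes to Ws, increasing or reducing channel weights by their contribution before the 1x1 convolution fuses all branches into a single global representation for classification.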
ISSN: 1424-8220
DOI: 10.3390/s22113968