Dual-Biometric Human Identification Using Radar Deep Transfer Learning

Bibliographic Details
Published in: Sensors (Basel, Switzerland), 2022-08, Vol. 22 (15), p. 5782
Main Authors: Alkasimi, Ahmad; Shepard, Tyler; Wagner, Samuel; Pancrazio, Stephen; Pham, Anh-Vu; Gardner, Christopher; Funsten, Brad
Format: Article
Language:English
Description
Summary: Accurate human identification using radar has a variety of potential applications, such as surveillance, access control, and security checkpoints. Nevertheless, radar-based human identification has been limited to a few motion-based biometrics that rely solely on micro-Doppler signatures. This paper proposes, for the first time, the use of combined radar-based heart sound and gait signals as biometrics for human identification. The proposed methodology starts by converting the extracted biometric signatures collected from 18 subjects to images; an image augmentation technique is then applied, and deep transfer learning is used to classify each subject. Validation accuracies of 58.7% and 96% are reported for the heart sound and gait biometrics, respectively. Next, the identification results of the two biometrics are combined using the joint probability mass function (PMF) method, yielding a 98% identification accuracy. To the best of our knowledge, this is the highest identification accuracy reported in the literature to date. Lastly, the trained networks are tested in an actual scenario, used in an office access control platform to identify different human subjects, where we report an accuracy of 76.25%.
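
Note: the record does not give the exact fusion rule used for the joint PMF step. A minimal sketch, assuming the two classifiers' per-subject softmax outputs are treated as independent PMFs over the 18 identities and combined by an element-wise product followed by renormalization, might look like the following (function and variable names are hypothetical):

import numpy as np

def fuse_joint_pmf(p_heart, p_gait):
    """Fuse two per-subject probability vectors into a joint PMF.

    Assumes the heart-sound and gait classifiers are conditionally
    independent, so the joint PMF over subject identities is the
    element-wise product of the individual PMFs, renormalized to sum to 1.
    """
    p_heart = np.asarray(p_heart, dtype=float)
    p_gait = np.asarray(p_gait, dtype=float)
    joint = p_heart * p_gait          # element-wise product of the two PMFs
    return joint / joint.sum()        # renormalize to a valid PMF

# Hypothetical softmax outputs for three of the subjects
p_heart = [0.5, 0.3, 0.2]   # weaker heart-sound classifier
p_gait  = [0.2, 0.7, 0.1]   # stronger gait classifier

joint = fuse_joint_pmf(p_heart, p_gait)
identity = int(np.argmax(joint))      # predicted subject index
print(joint, identity)

Under this independence assumption, the fused decision is dominated by whichever classifier is more confident, which is consistent with the reported fused accuracy (98%) exceeding either biometric alone.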
ISSN: 1424-8220
DOI: 10.3390/s22155782