ReSGait: The Real-Scene Gait Dataset
Format: Conference Proceeding
Language: English
Summary: Many studies have shown that gait recognition can be used to identify humans at a long distance, with promising results on current datasets. However, those datasets are collected under controlled situations and predefined conditions, which limits the extrapolation of the results to unconstrained situations in which subjects walk freely through a scene. To close this gap, we release a novel real-scene gait dataset (ReSGait), the first dataset collected in unconstrained scenarios with freely moving subjects and uncontrolled environmental parameters. Overall, our dataset comprises 172 subjects and 870 video sequences, recorded over 15 months. Video sequences are labeled with gender, clothing, carrying condition, walking route taken, and whether a mobile phone was used. The main characteristics that differentiate our dataset from others are therefore (i) uncontrolled real-life scenes and (ii) a long recording period. Finally, we empirically assess the difficulty of the proposed dataset by evaluating state-of-the-art gait approaches for the silhouette and pose modalities. The results reveal an accuracy of less than 35%, showing the inherent difficulty of our dataset compared to other current datasets, on which accuracies exceed 90%. Our proposed dataset thus establishes a new level of difficulty for the gait recognition problem, much closer to real life.
ISSN: 2474-9699
DOI: 10.1109/IJCB52358.2021.9484347
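The accuracy figures in the summary refer to the standard gait benchmark metric, rank-1 identification accuracy: each probe sequence is matched against a gallery of enrolled sequences, and a match counts as correct when the nearest gallery embedding belongs to the same subject. Below is a minimal sketch of that computation, assuming embeddings have already been extracted by some gait model (silhouette- or pose-based); all names are hypothetical and the paper's exact evaluation protocol may differ.

```python
# Minimal sketch of rank-1 identification accuracy for gait recognition.
# Hypothetical names throughout; the paper's exact gallery/probe split and
# distance metric are not specified here.
import numpy as np

def rank1_accuracy(gallery_feats, gallery_ids, probe_feats, probe_ids):
    """Fraction of probes whose nearest gallery embedding (Euclidean
    distance) belongs to the same subject."""
    # Pairwise squared Euclidean distances, shape (n_probe, n_gallery).
    dists = ((probe_feats[:, None, :] - gallery_feats[None, :, :]) ** 2).sum(-1)
    nearest = gallery_ids[dists.argmin(axis=1)]
    return float((nearest == probe_ids).mean())

# Toy usage with random embeddings standing in for model outputs.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(172, 64))            # one embedding per subject
g_ids = np.arange(172)
probes = gallery + rng.normal(scale=2.0, size=(172, 64))  # noisy probe views
print(f"rank-1 accuracy: {rank1_accuracy(gallery, g_ids, probes, g_ids):.2%}")
```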