The validity of a dual Azure Kinect-based motion capture system for gait analysis: a preliminary study

Bibliographic Details
Main Authors: Ma, Yunru; Sheng, Bo; Hart, Rylea; Zhang, Yanxin
Format: Conference Proceeding
Language: English
Summary: The Microsoft Kinect can track human motion across a range of motor tasks. The recently released Azure Kinect is reported to have improved image-sensing technology; however, the validity of this newest sensor for gait analysis is still unknown. In this study, a dual Azure Kinect-based motion capture system was developed, and gait analysis was conducted with five healthy adults. Joint angles calculated by this system were compared with those acquired by a Vicon motion capture system, and the coefficient of multiple correlation (CMC) and root mean square error (RMSE) were computed. The dual Azure Kinect system provided accurate knee angles (CMC=0.87±0.06, RMSE=11.9°±3.4°), and hip sagittal angles demonstrated moderate agreement with the reference (CMC=0.60±0.34, RMSE=15.1°±6.5°). The hip frontal, hip transverse, and ankle angles demonstrated poor validity. Although the level of accuracy varied across joints, the dual Azure Kinect system demonstrated better overall validity than the Kinect V2. Future studies should involve more participants and patient populations, and should compare different sensor versions simultaneously in the same experimental setup to substantiate the findings of this study. It is also necessary to standardize the experimental setup and to involve more sensors so that adequate depth images are available for analysis.
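
To illustrate the two agreement metrics named in the abstract, the following is a minimal Python/NumPy sketch, assuming time-normalized joint-angle waveforms and the Kadaba-style formulation of the CMC; the abstract does not specify the authors' exact implementation, and the function names, toy waveforms, and frame count below are illustrative assumptions only.

import numpy as np

def rmse(a, b):
    # Root mean square error between two joint-angle time series (degrees).
    a, b = np.asarray(a, float), np.asarray(b, float)
    return np.sqrt(np.mean((a - b) ** 2))

def cmc(waveforms):
    # Coefficient of multiple correlation (Kadaba-style) across G waveforms
    # of F frames each; `waveforms` has shape (G, F). Here G=2: the dual
    # Azure Kinect curve and the Vicon reference curve.
    y = np.asarray(waveforms, float)
    g, f = y.shape
    frame_mean = y.mean(axis=0)   # mean across systems at each frame
    grand_mean = y.mean()         # grand mean over all systems and frames
    within = ((y - frame_mean) ** 2).sum() / (g * (f - 1))
    total = ((y - grand_mean) ** 2).sum() / (g * f - 1)
    # Returns nan when the waveforms differ more than the overall variance
    # (conventionally reported as a non-computable CMC).
    return np.sqrt(1.0 - within / total)

# Example: compare a toy measured knee-flexion curve against a toy reference,
# both time-normalized to 101 points of the gait cycle.
t = np.linspace(0.0, 1.0, 101)
reference = 30.0 * np.sin(np.pi * t) ** 2
measured = reference + np.random.normal(0.0, 3.0, t.shape)
print(f"RMSE = {rmse(measured, reference):.1f} deg")
print(f"CMC  = {cmc([measured, reference]):.2f}")

A CMC near 1 indicates that the two waveforms share the same shape over the gait cycle, while the RMSE captures the average magnitude of the angular difference in degrees, which is why the two metrics are reported together.
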
ISSN: 2640-0103