Visual-Inertial Fusion-Based Human Pose Estimation: A Review
| Published in: | IEEE Transactions on Instrumentation and Measurement, 2023-01, Vol. 72, p. 1-1 |
|---|---|
| Main Authors: | , |
| Format: | Article |
| Language: | English |
| Summary: | Human pose estimation provides valuable information for biomedical research on human movement and for applications such as entertainment and physical exercise. The fusion of visual and inertial data has been increasingly studied over the past two decades to take advantage of these two naturally complementary sensing modalities. In this paper, we systematically reviewed the advances in visual-inertial fusion-based human pose estimation through a thorough search of related studies in five mainstream literature databases. A total of 54 studies were identified and included by screening the 4586 records retrieved in the review process. The estimation targets, hardware design, fusion methods, evaluation metrics, and system accuracy of these included studies were summarized and categorized for analysis. Building on these state-of-the-art studies, challenges in terms of mobility, calibration, real-time estimation, and evaluation methods are discussed in depth, and possible directions for overcoming these issues are recommended. We expect that this systematic review will provide researchers and engineers with a thorough understanding of the progress and performance of visual-inertial fusion-based human pose estimation. We also hope that the discussions of challenges and possible future directions can facilitate future work to improve such systems and promote their applications in real life. |
| ISSN: | 0018-9456; 1557-9662 |
| DOI: | 10.1109/TIM.2023.3286000 |