
PoseNormNet: Identity-Preserved Posture Normalization of 3-D Body Scans in Arbitrary Postures

Bibliographic Details
Published in: IEEE Transactions on Industrial Informatics, 2023-11, Vol. 19 (11), pp. 11298-11309
Main Authors: Zhao, Ran, Dai, Xinxin, Hu, Pengpeng, Munteanu, Adrian
Format: Article
Language: English
Description
Summary: Three-dimensional (3-D) human models accurately represent the shape of the subjects, which is key to many human-centric industrial applications, including fashion design, body biometrics extraction, and computer animation. These tasks usually require a high-fidelity human body mesh in a canonical posture (e.g., “A” pose or “T” pose). Although 3-D scanning technology is a fast and popular way to acquire a subject's body shape, automatically normalizing the posture of scanned bodies is still under-researched. Existing methods rely heavily on skeleton-driven animation technologies. However, these methods require a carefully designed skeleton and skin weights, which is time-consuming to prepare, and they fail when the initial posture is complicated. In this article, a novel deep learning-based approach, dubbed PoseNormNet, is proposed to automatically normalize the postures of scanned bodies. The proposed algorithm is highly practical since it does not require any rigging priors, and it works well for subjects in arbitrary postures. Extensive experimental results on both synthetic and real-world datasets demonstrate that the proposed method achieves state-of-the-art performance in both objective and subjective terms.
ISSN: 1551-3203, 1941-0050
DOI: 10.1109/TII.2023.3245682
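
The abstract contrasts PoseNormNet with skeleton-driven posture normalization, in which a scan is reposed by blending rigid bone transforms with per-vertex skin weights (linear blend skinning). The sketch below illustrates only that skeleton-driven baseline, not the paper's network; it assumes NumPy, and the function name, array shapes, and variable names are illustrative placeholders rather than anything defined in the article.

import numpy as np

def linear_blend_skinning(vertices, weights, bone_transforms):
    # vertices:        (V, 3) vertex positions of the scanned mesh
    # weights:         (V, B) per-vertex skin weights; each row sums to 1
    # bone_transforms: (B, 4, 4) rigid transforms moving each bone from the
    #                  scanned posture to the target canonical posture
    num_vertices = vertices.shape[0]
    # Homogeneous coordinates, shape (V, 4).
    verts_h = np.concatenate([vertices, np.ones((num_vertices, 1))], axis=1)
    # Blend the bone transforms per vertex, shape (V, 4, 4).
    blended = np.einsum("vb,bij->vij", weights, bone_transforms)
    # Apply each vertex's blended transform and drop the homogeneous coordinate.
    posed_h = np.einsum("vij,vj->vi", blended, verts_h)
    return posed_h[:, :3]

Producing the skeleton, the skin weights, and the per-bone transforms is precisely the manual rigging effort the abstract describes as time-consuming and fragile for complicated initial postures, which is the step PoseNormNet is designed to avoid.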