PatientHandNet: 3D Open-palm Hand Reconstruction from Sparse Multi-view Depth Images

Bibliographic Details
Published in: IEEE Transactions on Instrumentation and Measurement, 2023-08, p. 1-1
Main Authors: Dai, Xinxin, Zhao, Ran, Hu, Pengpeng, Munteanu, Adrian
Format: Article
Language:English
Description
Summary:Accurately reconstructing the 3D hand shapes of patients is important for immobilization device customization, artificial limb generation, and hand disease diagnosis. Traditional 3D hand scanning requires multiple scans taken around the hand with a 3D scanning device. These methods require patients to hold an open-palm posture during scanning, which is painful or even impossible for patients with impaired hand function. Moreover, once multi-view partial point clouds are collected, expensive post-processing is necessary to generate a high-fidelity hand shape. To address these limitations, we propose a novel deep-learning method, dubbed PatientHandNet, that reconstructs high-fidelity hand shapes in a canonical open-palm pose from multiple depth images acquired with a single depth camera. The hand poses in the depth images may vary and hand movements are allowed, facilitating the 3D scanning process, particularly for patients with difficult conditions. The proposed method is easy to operate since it is insensitive to the input pose, allowing for pose variations in the input depth images. We also introduce two novel datasets: a large-scale synthetic dataset to train our model and a real-world dataset with ground-truth hand biometrics extracted by an experienced anthropometrist. Extensive experiments on unseen synthetic data and real-world data demonstrate that the proposed method provides robust and easy-to-use hand shape reconstruction and outperforms state-of-the-art methods in terms of biometric accuracy.
ISSN:0018-9456
DOI:10.1109/TIM.2023.3304706