Fused features mining for depth-based hand gesture recognition to classify blind human communication

Bibliographic Details
Published in: Neural Computing & Applications, 2017-11, Vol. 28 (11), p. 3285-3294
Main Authors: Jadooki, Saba; Mohamad, Dzulkifli; Saba, Tanzila; Almazyad, Abdulaziz S.; Rehman, Amjad
Format: Article
Language:English
Description
Summary: Gesture recognition and hand pose tracking are widely used techniques in human–computer interaction. Depth data obtained from depth cameras provide a highly informative description of the body, and of the hand pose in particular, which can be used to build more accurate gesture recognition systems. Hand detection and feature extraction are challenging tasks on RGB images, but they can be solved effectively and simply with depth data. Moreover, depth data can be combined with color information for more reliable recognition. A typical hand gesture recognition system identifies the hand and its position or orientation, extracts a set of useful features, and applies a suitable machine-learning method to recognize the performed gesture. This paper presents a novel fusion of enhanced features for classifying static signs of sign language. It begins by explaining how the hand can be separated from the scene using depth data. Then, a combined feature extraction method is introduced to extract appropriate features from the images. Finally, an artificial neural network classifier is trained on these fused features and used to critically analyze the performance of the various descriptors.
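The pipeline the abstract describes (depth-based hand segmentation, fusion of several feature families into one vector, and a neural-network classifier) can be sketched roughly as below. Everything here is an illustrative assumption, not the paper's actual method: the depth thresholds, the particular shape and depth descriptors, and the tiny randomly initialized network exist only to show how the stages fit together.

```python
import numpy as np

def segment_hand(depth, near=300.0, far=600.0):
    """Keep pixels whose depth (mm) lies in a near/far band; with a depth
    camera the hand is usually the closest object to the sensor.
    (Thresholds are arbitrary placeholders, not from the paper.)"""
    return (depth > near) & (depth < far)

def geometric_features(mask):
    """A few simple shape descriptors of the segmented region:
    area, centroid, and bounding-box aspect ratio."""
    ys, xs = np.nonzero(mask)
    area = xs.size
    cx, cy = xs.mean(), ys.mean()
    w = xs.max() - xs.min() + 1
    h = ys.max() - ys.min() + 1
    return np.array([area, cx, cy, w / h], dtype=float)

def depth_features(depth, mask):
    """Depth statistics inside the hand region."""
    vals = depth[mask]
    return np.array([vals.mean(), vals.std()])

def fused_features(depth, mask):
    """'Fusion' here is simple concatenation of the two descriptor
    families, followed by normalisation."""
    f = np.concatenate([geometric_features(mask), depth_features(depth, mask)])
    return (f - f.mean()) / (f.std() + 1e-8)

def ann_forward(x, W1, b1, W2, b2):
    """Forward pass of a one-hidden-layer network; in practice the
    weights would be trained on labelled sign images."""
    h = np.tanh(W1 @ x + b1)
    return int(np.argmax(W2 @ h + b2))

# Toy usage on a synthetic depth map: background at 1 m, a rectangular
# "hand" blob at 45 cm, classified into one of 5 hypothetical signs.
depth = np.full((64, 64), 1000.0)
depth[20:40, 25:35] = 450.0
mask = segment_hand(depth)
x = fused_features(depth, mask)

rng = np.random.default_rng(0)                 # untrained, random weights
W1, b1 = rng.normal(size=(8, 6)), np.zeros(8)
W2, b2 = rng.normal(size=(5, 8)), np.zeros(5)
pred = ann_forward(x, W1, b1, W2, b2)
```

With the synthetic blob above, the fused vector has six components (four geometric, two depth-based); a real system would use richer descriptors and train the network weights rather than sample them at random.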
ISSN: 0941-0643, 1433-3058
DOI: 10.1007/s00521-016-2244-5