Real-time glove and android application for visual and audible Arabic sign language translation

Bibliographic Details
Published in: Procedia Computer Science, 2019, Vol. 163, pp. 450-459
Main Authors: Salem, Nema; Alharbi, Saja; Khezendar, Raghdah; Alshami, Hedaih
Format: Article
Language:English
Description
Summary: Researchers can develop new systems to capture, analyze, recognize, memorize, and interpret hand gestures using machine learning and sensors. Acoustic communication is a way to convey human opinions, feelings, messages, and information. Deaf and mute individuals communicate using sign language, which is not understandable by everyone; unfortunately, they face extreme difficulty in conveying their messages to others. To facilitate communication between deaf/mute individuals and hearing people, we propose a real-time prototype using a customized glove equipped with five flex sensors and one accelerometer. These sensors detect the bending of the fingers and the movements of the hand. In addition, we developed an Android mobile application that recognizes the captured Arabic Sign Language (ArSL) gestures and translates them into displayed text and audible speech. The developed prototype is accurate, low-cost, and fast in response.
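The pipeline summarized above — five flex-sensor readings for finger bending plus accelerometer axes for hand motion, matched against stored gesture patterns — could be sketched roughly as follows. The sensor value ranges, gesture templates, and nearest-neighbor matching here are illustrative assumptions, not the authors' actual calibration data or recognition method.

```python
import math

# Hypothetical gesture templates: 5 flex-sensor readings (finger bend,
# 0-1023 ADC range) followed by 3 accelerometer axes (in g).
# These example values are assumptions for illustration only.
GESTURE_TEMPLATES = {
    "letter_alef": [900, 200, 180, 190, 210, 0.0, 0.0, 1.0],
    "letter_ba":   [300, 850, 860, 840, 830, 0.1, 0.0, 0.9],
}

def classify(reading):
    """Return the gesture label whose template is nearest
    (Euclidean distance) to the 8-value sensor reading."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(GESTURE_TEMPLATES,
               key=lambda label: dist(reading, GESTURE_TEMPLATES[label]))

# A reading close to the first template maps to "letter_alef";
# a real app would then display the text and pass it to text-to-speech.
print(classify([890, 210, 185, 195, 205, 0.0, 0.1, 1.0]))
```

In a deployed version, the glove would stream readings (e.g. over Bluetooth) to the Android application, which would run the classifier and voice the result with a text-to-speech engine.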
ISSN: 1877-0509
DOI: 10.1016/j.procs.2019.12.128