
Recognition of static hand gestures of Indian sign language using CNN

Sign languages are natural languages used by hearing-impaired people, employing several means of expression for day-to-day communication. They relate the letters, words, and sentences of a spoken language to gestures, enabling signers to communicate among themselves. The deaf community can also interact with hearing people through an automated system that associates signs with the words of speech, supporting them in enhancing their abilities. A vision-based system that provides a feasible solution to the recognition of static Indian Sign Language (ISL) gestures is presented in this paper. The proposed method does not require signers to wear gloves or other marker devices to simplify hand segmentation. After modeling and analysis of the input hand image, a classification step recognizes the sign. Classification is performed with a convolutional neural network (CNN). CNN-based detection is robust to distortions such as shape changes due to the camera lens, varying lighting conditions, different poses, partial occlusions, and horizontal and vertical shifts. The system recognizes 5 ISL gestures with a recognition accuracy of 90.55%.

Bibliographic Details
Main Authors: Reshna, S., Sajeena, A., Jayaraju, M.
Format: Conference Proceeding
Language: English
container_issue 1
container_volume 2222
creator Reshna, S.; Sajeena, A.; Jayaraju, M.
description Sign languages are natural languages used by hearing-impaired people, employing several means of expression for day-to-day communication. They relate the letters, words, and sentences of a spoken language to gestures, enabling signers to communicate among themselves. The deaf community can also interact with hearing people through an automated system that associates signs with the words of speech, supporting them in enhancing their abilities. A vision-based system that provides a feasible solution to the recognition of static Indian Sign Language (ISL) gestures is presented in this paper. The proposed method does not require signers to wear gloves or other marker devices to simplify hand segmentation. After modeling and analysis of the input hand image, a classification step recognizes the sign. Classification is performed with a convolutional neural network (CNN). CNN-based detection is robust to distortions such as shape changes due to the camera lens, varying lighting conditions, different poses, partial occlusions, and horizontal and vertical shifts. The system recognizes 5 ISL gestures with a recognition accuracy of 90.55%.
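The pipeline the abstract describes (segment the hand image, then classify it with a CNN) can be sketched as a single forward pass. The following is a minimal NumPy illustration under assumed sizes: a 32×32 grayscale hand crop, four 3×3 kernels, and five output classes. These choices, and the random weights standing in for trained parameters, are illustrative assumptions, not the authors' architecture.

```python
# Sketch of a CNN forward pass for 5-class static gesture recognition:
# conv -> ReLU -> max-pool -> flatten -> dense softmax.
# All sizes are assumptions for illustration, not the paper's network.
import numpy as np

rng = np.random.default_rng(0)

def conv2d(img, kernels):
    """Valid 2-D convolution of a (H, W) image with (K, kh, kw) kernels."""
    K, kh, kw = kernels.shape
    H, W = img.shape
    out = np.zeros((K, H - kh + 1, W - kw + 1))
    for k in range(K):
        for i in range(H - kh + 1):
            for j in range(W - kw + 1):
                out[k, i, j] = np.sum(img[i:i + kh, j:j + kw] * kernels[k])
    return out

def max_pool(x, size=2):
    """2x2 max pooling per feature map; gives tolerance to small shifts."""
    K, H, W = x.shape
    H2, W2 = H // size, W // size
    return x[:, :H2 * size, :W2 * size].reshape(K, H2, size, W2, size).max(axis=(2, 4))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def predict(img, kernels, W_fc, b_fc):
    """Forward pass: conv -> ReLU -> pool -> flatten -> softmax."""
    feat = np.maximum(conv2d(img, kernels), 0.0)   # ReLU nonlinearity
    pooled = max_pool(feat)
    return softmax(pooled.ravel() @ W_fc + b_fc)

img = rng.random((32, 32))                 # segmented grayscale hand crop
kernels = rng.standard_normal((4, 3, 3))   # 4 hypothetical 3x3 filters
flat = 4 * 15 * 15                         # 4 maps of 15x15 after pooling 30x30
W_fc = rng.standard_normal((flat, 5)) * 0.01
b_fc = np.zeros(5)

probs = predict(img, kernels, W_fc, b_fc)  # class probabilities over 5 gestures
```

The pooling step is one source of the shift tolerance the abstract mentions: small horizontal or vertical displacements of the hand move activations within a pooling window without changing the pooled maximum.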
doi_str_mv 10.1063/5.0004485
format conference_proceeding
fulltext fulltext
identifier ISSN: 0094-243X
ispartof AIP conference proceedings, 2020, Vol.2222 (1)
issn 0094-243X (print); 1551-7616 (electronic)
language eng
recordid cdi_proquest_journals_2390334735
source American Institute of Physics:Jisc Collections:Transitional Journals Agreement 2021-23 (Reading list)
subjects Gloves
Human communication
Image classification
Languages
Neural networks
Recognition
Sentences
Sign language
Vision systems
Words (language)
title Recognition of static hand gestures of Indian sign language using CNN