
LM-Net: a dynamic gesture recognition network with long-term aggregation and motion excitation

In recent years, there has been growing interest in dynamic hand gestures as a natural means of human–computer interaction. However, existing methods for recognizing dynamic gestures have certain limitations, particularly in consistently capturing and focusing on the hand-movement region across varied motion patterns. This paper presents LM-Net, an effective network comprising a Long-term Aggregation Module and a Motion Excitation Module. The Motion Excitation Module exploits motion information extracted from neighboring frames to amplify motion-sensitive channels, while the Long-term Aggregation Module uses dynamic convolution to aggregate temporal information across diverse motion patterns. Experiments on the EgoGesture and Jester datasets demonstrate that LM-Net surpasses most existing approaches in accuracy while maintaining a low computational cost.
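
The abstract describes two components: a Motion Excitation Module that amplifies motion-sensitive channels using differences between neighboring frames, and a Long-term Aggregation Module that aggregates temporal information with dynamic convolution. The paper's implementation details are not given in this record; the following is only a minimal sketch of how a frame-difference-based motion excitation block of this general kind is commonly built. The layer sizes, reduction ratio, and residual gating are illustrative assumptions, not the authors' design.

```python
import torch
import torch.nn as nn


class MotionExcitationSketch(nn.Module):
    """Illustrative frame-difference channel gating (not the LM-Net implementation)."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        reduced = max(channels // reduction, 1)
        # Channel squeeze before computing frame differences (assumed design choice).
        self.squeeze = nn.Conv2d(channels, reduced, kernel_size=1, bias=False)
        # Expand back to per-channel gates.
        self.expand = nn.Conv2d(reduced, channels, kernel_size=1, bias=False)
        self.pool = nn.AdaptiveAvgPool2d(1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, channels, height, width)
        b, t, c, h, w = x.shape
        feat = self.squeeze(x.reshape(b * t, c, h, w)).reshape(b, t, -1, h, w)
        # Motion cue: difference between neighboring frames, zero-padded at the end.
        diff = feat[:, 1:] - feat[:, :-1]
        diff = torch.cat([diff, torch.zeros_like(diff[:, :1])], dim=1)
        # Spatial pooling -> per-channel motion statistics -> sigmoid gate.
        gate = self.pool(diff.reshape(b * t, -1, h, w))
        gate = torch.sigmoid(self.expand(gate)).reshape(b, t, c, 1, 1)
        # Residually amplify motion-sensitive channels.
        return x + x * gate


# Example: an 8-frame clip with 64-channel feature maps.
clip = torch.randn(2, 8, 64, 28, 28)
out = MotionExcitationSketch(64)(clip)
print(out.shape)  # torch.Size([2, 8, 64, 28, 28])
```

The sketch follows the familiar squeeze-and-excite pattern: frame differences approximate motion, spatial pooling turns them into per-channel statistics, and a sigmoid gate re-weights the original features so that motion-sensitive channels are emphasized.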

Bibliographic Details
Published in: International journal of machine learning and cybernetics, 2024-04, Vol.15 (4), p.1633-1645
Main Authors: Chang, Shaopeng; Huang, Xueyu
Format: Article
Language: English
Subjects: Artificial Intelligence; Complex Systems; Computational Intelligence; Control; Deep learning; Engineering; Excitation; Gesture recognition; Harnesses; Machine learning; Mechatronics; Modules; Movement; Neural networks; Original Article; Pattern Recognition; Robotics; Systems Biology
Publisher: Springer Berlin Heidelberg (Berlin/Heidelberg)
ISSN: 1868-8071
EISSN: 1868-808X
DOI: 10.1007/s13042-023-01987-3