Artificial Neural Network-Based Activities Classification, Gait Phase Estimation, and Prediction

Gait patterns are critical to health monitoring, gait impairment assessment, and wearable device control. Unrhythmic gait pattern detection under community-based conditions is a new frontier in this area. The present paper describes a high-accuracy gait phase estimation and prediction algorithm built on a two-stage artificial neural network. This work aims to develop an algorithm that can estimate and predict the gait cycle in real time using a portable controller with only two IMU sensors (one on each thigh) in the community setting. Our algorithm can detect the gait phase in unrhythmic conditions during walking, stair ascending, and stair descending, and classify these activities along with standing. Moreover, our algorithm is able to predict both future intra- and inter-stride gait phases, offering a potential means to improve wearable device controller performance. The proposed data-driven algorithm is based on a dataset consisting of 5 able-bodied subjects and validated on 3 different able-bodied subjects. Under unrhythmic activity situations, validation shows that the algorithm can accurately identify multiple activities with 99.55% accuracy, and estimate (rRMSE_0: 6.3%) and predict 200 ms ahead (rRMSE_200ms: 8.6%) the gait phase percentage in real time; these errors are on average 57.7 and 54.0% smaller than those of the event-based method in the same conditions. This study showcases a solution to estimate and predict gait status for multiple unrhythmic activities, which may be deployed to controllers for wearable robots or health monitoring devices.
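The abstract reports gait phase error as a relative RMSE over the gait cycle percentage. The paper's exact normalization is not given in this record, so the following is only an illustrative sketch under one common convention: gait phase is cyclic (0–100% of a stride), so errors near the wrap point are taken the short way around before computing the RMSE.

```python
import math

def wrapped_phase_error(est, true, period=100.0):
    """Smallest signed difference between two cyclic phase values (in % of gait cycle)."""
    d = (est - true) % period
    return d - period if d > period / 2 else d

def rrmse(estimates, truths, period=100.0):
    """RMSE of wrapped phase errors, expressed as a percentage of the full cycle."""
    errs = [wrapped_phase_error(e, t, period) for e, t in zip(estimates, truths)]
    rmse = math.sqrt(sum(x * x for x in errs) / len(errs))
    return 100.0 * rmse / period

# An estimate of 1% against a ground truth of 99% is a 2% error (across the
# heel-strike wrap), not a 98% error.
est = [1.0, 99.0, 50.0]
true = [99.0, 1.0, 52.0]
print(rrmse(est, true))  # -> 2.0
```

Without the wrap-around step, samples straddling heel strike would dominate the metric and grossly overstate the error.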


Bibliographic Details
Published in: Annals of Biomedical Engineering, 2023-07, Vol. 51 (7), p. 1471-1484
Main Authors: Yu, Shuangyue, Yang, Jianfu, Huang, Tzu-Hao, Zhu, Junxi, Visco, Christopher J., Hameed, Farah, Stein, Joel, Zhou, Xianlian, Su, Hao
Format: Article
Language:English
DOI: 10.1007/s10439-023-03151-y
Publisher: Springer International Publishing, Cham
PMID: 36681749
ISSN: 0090-6964
EISSN: 1573-9686
Source: Springer Nature
Subjects: Algorithms; Artificial neural networks; Biochemistry; Biological and Medical Physics; Biomedical and Life Sciences; Biomedical Engineering and Bioengineering; Biomedicine; Biophysics; Classical Mechanics; Controllers; Gait; Neural networks; Original Article; Real time; Thigh; Wearable computers; Wearable technology