
HARMamba: Efficient and Lightweight Wearable Sensor Human Activity Recognition Based on Bidirectional Mamba

Bibliographic Details
Published in: IEEE Internet of Things Journal, 2024-09, p. 1-1
Main Authors: Li, Shuangjian, Zhu, Tao, Duan, Furong, Chen, Liming, Ning, Huansheng, Nugent, Christopher, Wan, Yaping
Format: Article
Language:English
Description: Wearable sensor-based human activity recognition (HAR) is a critical research domain in activity perception. However, achieving high efficiency and long-sequence recognition remains a challenge. Despite extensive investigation of temporal deep learning models such as CNNs, RNNs, and Transformers, their large parameter counts often impose significant computational and memory constraints, rendering them less suitable for resource-constrained mobile health applications. This study introduces HARMamba, an innovative lightweight and versatile HAR architecture that combines a selective bidirectional state space model with hardware-aware design. To optimize real-time resource consumption in practical scenarios, HARMamba employs linear recursive mechanisms and parameter discretization, allowing it to focus selectively on relevant input sequences while efficiently fusing scan and recompute operations. The model processes sensor data streams through independent channels, dividing each channel into patches and appending a classification token to the end of the sequence. It uses position embeddings to represent the sequence order. The patch sequence is subsequently processed by HARMamba blocks, and a classification head finally outputs the activity category. The HARMamba block is the fundamental component of the architecture, enabling effective capture of more discriminative activity-sequence features. HARMamba outperforms contemporary state-of-the-art frameworks, delivering comparable or better accuracy while significantly reducing computational and memory demands. Its effectiveness has been extensively validated on four publicly available datasets, namely PAMAP2, WISDM, UniMiB SHAR, and UCI. The F1 scores of HARMamba on the four datasets are 99.74%, 99.20%, 88.23%, and 97.01%, respectively.
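The tokenization pipeline the abstract describes (per-channel patching, a classification token appended to the end of each sequence, and added position embeddings) can be illustrated with a minimal NumPy sketch. This is not the authors' code: the function names (`patchify`, `build_token_sequence`) are hypothetical, a fixed random projection stands in for the learned patch embedding, and the HARMamba block itself is omitted.

```python
import numpy as np

def patchify(signal, patch_len):
    """Split a 1-D sensor channel of length T into non-overlapping
    patches of length patch_len (any trailing remainder is dropped)."""
    n = len(signal) // patch_len
    return signal[: n * patch_len].reshape(n, patch_len)

def build_token_sequence(channels, patch_len, embed, cls_token, pos_embed):
    """For each independent sensor channel: patch the stream, project
    patches to d_model, append the classification token to the END of
    the sequence, then add position embeddings (as in the abstract)."""
    seqs = []
    for ch in channels:
        patches = patchify(ch, patch_len)          # (num_patches, patch_len)
        tokens = patches @ embed                   # (num_patches, d_model)
        tokens = np.vstack([tokens, cls_token])    # class token appended last
        seqs.append(tokens + pos_embed[: len(tokens)])
    return seqs

# Toy dimensions: 3 channels of 128 samples, patch length 16, d_model 8.
rng = np.random.default_rng(0)
T, P, D = 128, 16, 8
channels = [rng.standard_normal(T) for _ in range(3)]
embed = rng.standard_normal((P, D))        # stand-in for a learned projection
cls_token = rng.standard_normal((1, D))
pos_embed = rng.standard_normal((T // P + 1, D))

seqs = build_token_sequence(channels, P, embed, cls_token, pos_embed)
# Each channel yields 128/16 = 8 patch tokens plus 1 class token.
```

Each resulting sequence has shape `(9, 8)`; in the paper these token sequences would then pass through the bidirectional Mamba blocks, with the classification head reading out the activity category.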
DOI: 10.1109/JIOT.2024.3463405
ISSN: 2327-4662
Source: IEEE Electronic Library (IEL) Journals
Subjects:
Computational modeling
Context modeling
Data models
Deep learning
Human activity recognition
Light-weight
Selective State Space Models
Training
Transformers
Wearable Sensors