
Human Activity Recognition Based on Acceleration Data From Smartphones Using HMMs

Smartphones are among the most popular wearable devices to monitor human activities. Several existing methods for Human Activity Recognition (HAR) using data from smartphones are based on conventional pattern recognition techniques, but they generate handcrafted feature vectors. This drawback is overcome by deep learning techniques, which unfortunately require substantial computing resources while generating less interpretable feature vectors. The current paper addresses these limitations through the proposal of a Hidden Markov Model (HMM)-based technique for HAR. More formally, the sequential variations of spatial locations within the raw data vectors are initially captured in Markov chains, which are later used for the initialization and the training of HMMs. Meta-data extracted from these models are then saved as the components of the feature vectors. The meta-data are related to the overall time spent by the model observing every symbol over a long time span, irrespective of the state from which this symbol is observed. Classification experiments involving four classification tasks have been carried out on the recently constructed UniMiB SHAR database, which contains 17 classes, including 9 types of activities of daily living and 8 types of falls. As a result, the proposed approach has shown best accuracies between 92% and 98.85% for all the classification tasks. This performance is more than 10% better than prior work for 2 out of 4 classification tasks.


Bibliographic Details
Published in: IEEE Access, 2021, Vol. 9, pp. 139336-139351
Main Authors: Iloga, Sylvain; Bordat, Alexandre; Le Kernec, Julien; Romain, Olivier
Format: Article
Language: English
Subjects: activities of daily living; Activity recognition; Classification; Computer Science; Deep learning; Engineering Sciences; fall detection; Feature extraction; Hidden Markov models; Human activity recognition; Image Processing; Machine learning; Markov chains; Moving object recognition; Pattern recognition; Sensors; Signal and Image processing; Smart phones; smartphone sensors; Smartphones; Support vector machines; Wearable technology
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3117336
Publisher: IEEE (Piscataway)