A Non-Laboratory Gait Dataset of Full Body Kinematics and Egocentric Vision
In this manuscript, we describe a unique dataset of human locomotion captured in a variety of out-of-the-laboratory environments using Inertial Measurement Unit (IMU)-based wearable motion capture. The data contain full-body kinematics for walking, with and without stops, stair ambulation, obstacle course navigation, dynamic movements intended to test agility, and negotiating common obstacles in public spaces such as chairs. The dataset contains 24.2 total hours of movement data from a college student population with an approximately equal split of males to females. In addition, for one of the activities, we captured the egocentric field of view and gaze of the subjects using an eye tracker. Finally, we provide some examples of applications using the dataset and discuss how it might open possibilities for new studies in human gait analysis.
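For a quick feel for working with data of this kind, the sketch below shows one way to inspect a single IMU-derived kinematics trial in Python. The file name, column layout, and 100 Hz sampling rate are assumptions for illustration only; this record does not specify the dataset's file formats, so consult the dataset documentation for the actual structure.

```python
# Minimal sketch (not from the paper): load one hypothetical kinematics trial
# exported as CSV and print basic statistics. The file name, column names, and
# 100 Hz sampling rate are assumptions, not details taken from this record.
import pandas as pd

ASSUMED_SAMPLE_RATE_HZ = 100  # assumed IMU sampling rate


def summarize_trial(csv_path: str) -> None:
    """Print sample count, approximate duration, and per-column range for one trial."""
    df = pd.read_csv(csv_path)
    duration_s = len(df) / ASSUMED_SAMPLE_RATE_HZ
    print(f"{csv_path}: {len(df)} samples (~{duration_s:.1f} s)")
    print("columns:", list(df.columns))
    # Range of motion per numeric column (e.g., joint angles in degrees)
    numeric = df.select_dtypes("number")
    print((numeric.max() - numeric.min()).rename("range"))


if __name__ == "__main__":
    # Hypothetical file name; substitute a real trial file from the dataset.
    summarize_trial("subject01_walking_kinematics.csv")
```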
Published in: | Scientific Data, 2023-01, Vol. 10 (1), Article 26 (11 pages) |
---|---|
Main Authors: | Sharma, Abhishek; Rai, Vijeth; Calvert, Melissa; Dai, Zhongyi; Guo, Zhenghao; Boe, David; Rombokas, Eric |
Format: | Article |
Language: | English |
Subjects: | Biomechanical Phenomena; Classrooms; Computer engineering; Datasets; Experiments; Female; Gait; Humans; Kinematics; Laboratories; Locomotion; Male; Motion capture; Public spaces; Walking |
Online Access: | Open access full text via DOI: https://doi.org/10.1038/s41597-023-01932-7 |
container_end_page | 11 |
container_issue | 1 |
container_start_page | 26 |
container_title | Scientific data |
container_volume | 10 |
creator | Sharma, Abhishek; Rai, Vijeth; Calvert, Melissa; Dai, Zhongyi; Guo, Zhenghao; Boe, David; Rombokas, Eric |
description | In this manuscript, we describe a unique dataset of human locomotion captured in a variety of out-of-the-laboratory environments using Inertial Measurement Unit (IMU)-based wearable motion capture. The data contain full-body kinematics for walking, with and without stops, stair ambulation, obstacle course navigation, dynamic movements intended to test agility, and negotiating common obstacles in public spaces such as chairs. The dataset contains 24.2 total hours of movement data from a college student population with an approximately equal split of males to females. In addition, for one of the activities, we captured the egocentric field of view and gaze of the subjects using an eye tracker. Finally, we provide some examples of applications using the dataset and discuss how it might open possibilities for new studies in human gait analysis. |
doi_str_mv | 10.1038/s41597-023-01932-7 |
format | article |
publisher | Nature Publishing Group UK, London |
date | 2023-01-12 |
pmid | 36635316 |
rights | The Author(s) 2023. Published under a Creative Commons Attribution 4.0 license (http://creativecommons.org/licenses/by/4.0/). |
orcidid | 0000-0001-6666-2179 |
oa | free_for_read |
identifier | ISSN: 2052-4463 |
ispartof | Scientific Data, 2023-01, Vol. 10 (1), Article 26 (11 pages) |
issn | 2052-4463 (ISSN); 2052-4463 (EISSN) |
language | eng |
source | Publicly Available Content Database; PubMed Central; Springer Nature - nature.com Journals - Fully Open Access |
subjects | 631/114/1305; 639/166/985; 639/166/988; 639/705/1046; 692/700/478; Biomechanical Phenomena; Classrooms; Computer engineering; Data Descriptor; Datasets; Experiments; Female; Gait; Humanities and Social Sciences; Humans; Kinematics; Laboratories; Locomotion; Male; Motion capture; multidisciplinary; Public spaces; Science; Science (multidisciplinary); Walking |
title | A Non-Laboratory Gait Dataset of Full Body Kinematics and Egocentric Vision |