
Depth-Camera-Based Under-Blanket Sleep Posture Classification Using Anatomical Landmark-Guided Deep Learning Model

Emerging sleep health technologies will have an impact on monitoring patients with sleep disorders. This study proposes a new deep learning model architecture that improves under-blanket sleep posture classification accuracy by leveraging anatomical landmark features through an attention strategy. The system used an integrated visible-light and depth camera. Deep learning models (ResNet-34, EfficientNet B4, and ECA-Net50) were trained on depth images. We compared the models with and without an anatomical landmark coordinate input generated by an open-source pose estimation model from the visible-light image data. We recruited 120 participants to perform seven major sleep postures, namely the supine posture, prone postures with the head turned left and right, left- and right-sided log postures, and left- and right-sided fetal postures, under four blanket conditions: no blanket, thin, medium, and thick. A data augmentation technique was applied to the blanket conditions. The data were split at an 8:2 training-to-testing ratio. ECA-Net50 produced the best classification results, and incorporating the anatomical landmark features increased its F1 score from 87.4% to 92.2%. Our findings also suggest that the classification performance of deep learning models guided by anatomical landmark features was less affected by the interference of blanket conditions.
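The abstract reports classification performance as an F1 score (ECA-Net50 improving from 87.4% to 92.2% with landmark guidance). As a minimal sketch, assuming the standard macro-averaged formula for a multi-class task such as the seven posture classes here (the record does not specify the exact averaging used), the metric can be computed as:

```python
def macro_f1(y_true, y_pred):
    """Macro-averaged F1: per-class precision/recall from true positives,
    false positives, and false negatives, then an unweighted mean."""
    classes = sorted(set(y_true) | set(y_pred))
    scores = []
    for c in classes:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0
        scores.append(2 * precision * recall / (precision + recall)
                      if (precision + recall) else 0.0)
    return sum(scores) / len(scores)

# Hypothetical labels for seven posture classes (0-6); not study data.
truth = [0, 1, 2, 3, 4, 5, 6, 0]
preds = [0, 1, 2, 3, 4, 5, 6, 1]
print(round(macro_f1(truth, preds), 3))  # → 0.905
```

The names `macro_f1`, `truth`, and `preds` are illustrative, not from the paper; with seven near-balanced classes, macro averaging weights each posture equally regardless of class frequency.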

Bibliographic Details
Published in: International journal of environmental research and public health, 2022-10, Vol. 19 (20), p. 13491
Main Authors: Tam, Andy Yiu-Chau, Zha, Li-Wen, So, Bryan Pak-Hei, Lai, Derek Ka-Hei, Mao, Ye-Jiao, Lim, Hyo-Jung, Wong, Duo Wai-Chi, Cheung, James Chung-Wai
Format: Article
Language: English
DOI: 10.3390/ijerph192013491
ISSN: 1660-4601, 1661-7827
PMID: 36294072
Publisher: MDPI AG, Switzerland
Subjects: Accuracy; Cameras; Classification; Data collection; Deep Learning; Experiments; Fetuses; Humans; Neural networks; Physiology; Pose estimation; Posture; Sensors; Sleep; Sleep deprivation; Sleep disorders; Sleep Wake Disorders; Support vector machines