
SenseHunger: Machine Learning Approach to Hunger Detection Using Wearable Sensors

The perception of hunger and satiety is of great importance for maintaining a healthy body weight and avoiding chronic diseases such as obesity, underweight, or deficiency syndromes due to malnutrition. A number of disease patterns are characterized by a chronic loss of this perception. To the best of our knowledge, hunger and satiety cannot be classified using non-invasive measurements. Aiming to develop an objective classification system, this paper presents a multimodal sensory system, with associated signal processing and pattern recognition methods, for hunger and satiety detection based on non-invasive monitoring. We used an Empatica E4 smartwatch, a RespiBan wearable device, and JINS MEME smart glasses to capture physiological signals from five healthy, normal-weight subjects sitting inactively on a chair in states of hunger and satiety. After pre-processing the signals, we compared different feature extraction approaches, based either on manual feature engineering or on deep feature learning. Comparative experiments were carried out to determine the most appropriate sensor channel, device, and classifier for reliably discriminating between the hunger and satiety states. Our experiments showed that the most discriminative features come from three specific sensor modalities: Electrodermal Activity (EDA), infrared Thermopile (Tmp), and Blood Volume Pulse (BVP).

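The abstract describes a pipeline of signal pre-processing, feature extraction (manual feature engineering or deep feature learning), and classifier comparison over wearable sensor channels. As an illustration only, the Python sketch below mimics the manual-feature branch of such a pipeline on synthetic EDA, thermopile, and BVP windows; the sampling rate, window length, feature set, classifiers, and data are assumptions made here and do not reproduce the authors' actual implementation or results.

```python
# Illustrative sketch (not the paper's implementation): window the physiological
# signals, compute simple hand-crafted features per window, and compare classifiers.
# Synthetic data, window length, features, and classifier settings are assumptions.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
FS = 4            # assumed common sampling rate (Hz) after resampling all channels
WIN = 60 * FS     # assumed 60 s analysis windows

def synthetic_windows(n_windows, shift):
    """Fake EDA/Tmp/BVP windows; `shift` nudges the statistics per class."""
    eda = rng.normal(0.5 + shift, 0.1, (n_windows, WIN))
    tmp = rng.normal(33.0 + shift, 0.3, (n_windows, WIN))
    bvp = rng.normal(0.0, 1.0 + shift, (n_windows, WIN))
    return np.stack([eda, tmp, bvp], axis=1)        # (n_windows, 3 channels, WIN)

def manual_features(windows):
    """Per-channel statistical features: mean, std, min, max, and a crude trend."""
    feats = [stat(windows, axis=2) for stat in (np.mean, np.std, np.min, np.max)]
    feats.append(windows[:, :, -1] - windows[:, :, 0])   # first-to-last difference
    return np.concatenate(feats, axis=1)                 # (n_windows, 3 * 5)

# Label 0 = hunger, 1 = satiety (synthetic stand-ins for the recorded states).
X = np.concatenate([manual_features(synthetic_windows(100, 0.0)),
                    manual_features(synthetic_windows(100, 0.2))])
y = np.concatenate([np.zeros(100), np.ones(100)])

classifiers = {
    "SVM (RBF)": SVC(),
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
}
for name, clf in classifiers.items():
    scores = cross_val_score(make_pipeline(StandardScaler(), clf), X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.2f}")
```

A study like this would typically evaluate with subject-independent splits (e.g., leave-one-subject-out) rather than the plain k-fold shown here; the sketch only demonstrates the overall structure of windowing, feature extraction, and classifier comparison.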
Bibliographic Details
Published in: Sensors (Basel, Switzerland), 2022-10, Vol.22 (20), p.7711
Main Authors: Irshad, Muhammad Tausif; Nisar, Muhammad Adeel; Huang, Xinyu; Hartz, Jana; Flak, Olaf; Li, Frédéric; Gouverneur, Philip; Piet, Artur; Oltmanns, Kerstin M; Grzegorzek, Marcin
Format: Article
Language: English
Subjects: Blood volume; Body weight; Chronic diseases; Classification; Connectivity; Data processing; Epidemiology; Eyewear; Hormones; hunger; Investigations; Machine learning; multimodal sensing; Neural networks; non-invasive sensing; Overweight; Pattern recognition; Perception; physiological signals; Physiology; satiety; Sensors; Signal processing; Smartwatches; Thermopiles; Wearable technology
Publisher: MDPI AG (Basel)
ISSN: 1424-8220
DOI: 10.3390/s22207711
Online Access: https://doi.org/10.3390/s22207711