GEM-RL: Generalized Energy Management of Wearable Devices using Reinforcement Learning
Energy harvesting (EH) and management (EM) have emerged as enablers of self-sustained wearable devices. Since EH alone is not sufficient for self-sustainability due to uncertainties of ambient sources and user activities, there is a critical need for a user-independent EM approach that does not rely on expected EH predictions. We present a generalized energy management framework (GEM-RL) using multi-objective reinforcement learning (MORL). GEM-RL learns the trade-off between utilization and the battery energy level of the target device under dynamic EH patterns and battery conditions. It also uses a lightweight approximate dynamic programming (ADP) technique that utilizes the trained MORL agent to optimize the utilization of the device over a longer period. Thorough experiments show that, on average, GEM-RL achieves Pareto front solutions within 5.4% of the offline Oracle for a given day. For a 7-day horizon, it achieves utility within 4% of the offline Oracle and up to 50% higher utility than baseline EM approaches. The hardware implementation on a wearable device shows negligible execution time (1.98 ms) and energy consumption (23.17 μJ) overhead.
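The abstract describes two ingredients: an MORL agent that trades off device utilization against the battery energy level, and a lightweight ADP step that reuses the trained agent to plan utilization over a longer horizon. The sketch below only illustrates these ideas; it is not the authors' implementation, and every constant, battery model, and name in it is an assumption. It substitutes a linearly scalarized two-objective reward for a trained MORL policy, and a rollout-based lookahead over a few candidate duty cycles for the paper's ADP procedure.

```python
# Illustrative sketch only (not GEM-RL's actual policy, state, or reward):
# a scalarized utilization-vs-battery reward plus a rollout lookahead that
# picks a duty cycle for the next slot. All dynamics and constants are
# assumed for the example.
import random

BATTERY_CAPACITY_J = 50.0   # assumed usable battery energy (J)
ACTIVE_POWER_W = 0.010      # assumed power draw at full utilization (W)
SLOT_S = 60.0               # decision interval (s)

def step_battery(energy_j, duty_cycle, harvested_j):
    """Advance the assumed battery model by one decision slot."""
    consumed = duty_cycle * ACTIVE_POWER_W * SLOT_S
    return min(BATTERY_CAPACITY_J, max(0.0, energy_j - consumed + harvested_j))

def scalarized_reward(duty_cycle, energy_j, w_util):
    """Weighted sum of the two objectives: utilization and normalized battery level."""
    return w_util * duty_cycle + (1.0 - w_util) * (energy_j / BATTERY_CAPACITY_J)

def rollout_value(energy_j, first_action, eh_forecast, w_util):
    """Score a first action by rolling out a simple continuation policy."""
    total, action = 0.0, first_action
    for harvested in eh_forecast:
        energy_j = step_battery(energy_j, action, harvested)
        total += scalarized_reward(action, energy_j, w_util)
        # Greedy continuation: back off when the battery runs low.
        action = 0.2 if energy_j < 0.2 * BATTERY_CAPACITY_J else 0.8
    return total

def choose_duty_cycle(energy_j, eh_forecast, w_util=0.5):
    """Rollout-based lookahead over a small discrete action set (ADP-style)."""
    actions = [0.0, 0.25, 0.5, 0.75, 1.0]
    return max(actions, key=lambda a: rollout_value(energy_j, a, eh_forecast, w_util))

if __name__ == "__main__":
    random.seed(0)
    energy = 0.5 * BATTERY_CAPACITY_J
    forecast = [random.uniform(0.0, 0.3) for _ in range(10)]  # assumed EH per slot (J)
    for preference in (0.2, 0.5, 0.8):  # sweep the utilization/battery trade-off
        print(preference, choose_duty_cycle(energy, forecast, w_util=preference))
```

Sweeping the preference weight traces out different operating points on the utilization/battery trade-off, which is the role the Pareto-front solutions play in the paper.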
Main Authors: | Basaklar, Toygun ; Tuncel, Yigit ; Gumussoy, Suat ; Ogras, Umit |
---|---|
Format: | Conference Proceeding |
Language: | English |
Subjects: | Batteries ; dynamic programming ; Energy consumption ; Energy harvesting ; energy management ; Hardware ; multi-objective reinforcement learning ; Reinforcement learning ; Time measurement ; Uncertainty ; Wearable computers |
Online Access: | Request full text |
cited_by | |
---|---|
cites | |
container_end_page | 6 |
container_issue | |
container_start_page | 1 |
container_title | 2023 Design, Automation & Test in Europe Conference & Exhibition (DATE) |
container_volume | |
creator | Basaklar, Toygun ; Tuncel, Yigit ; Gumussoy, Suat ; Ogras, Umit |
description | Energy harvesting (EH) and management (EM) have emerged as enablers of self-sustained wearable devices. Since EH alone is not sufficient for self-sustainability due to uncertainties of ambient sources and user activities, there is a critical need for a user-independent EM approach that does not rely on expected EH predictions. We present a generalized energy management framework (GEM-RL) using multi-objective reinforcement learning (MORL). GEM-RL learns the trade-off between utilization and the battery energy level of the target device under dynamic EH patterns and battery conditions. It also uses a lightweight approximate dynamic programming (ADP) technique that utilizes the trained MORL agent to optimize the utilization of the device over a longer period. Thorough experiments show that, on average, GEM-RL achieves Pareto front solutions within 5.4% of the offline Oracle for a given day. For a 7-day horizon, it achieves utility within 4% of the offline Oracle and up to 50% higher utility than baseline EM approaches. The hardware implementation on a wearable device shows negligible execution time (1.98 ms) and energy consumption (23.17 μJ) overhead. |
doi_str_mv | 10.23919/DATE56975.2023.10137228 |
format | conference_proceeding |
fulltext | fulltext_linktorsrc |
identifier | EISSN: 1558-1101 |
ispartof | 2023 Design, Automation & Test in Europe Conference & Exhibition (DATE), 2023, p.1-6 |
issn | 1558-1101 |
language | eng |
recordid | cdi_ieee_primary_10137228 |
source | IEEE Xplore All Conference Series |
subjects | Batteries ; dynamic programming ; Energy consumption ; Energy harvesting ; energy management ; Hardware ; multi-objective reinforcement learning ; Reinforcement learning ; Time measurement ; Uncertainty ; Wearable computers |
title | GEM-RL: Generalized Energy Management of Wearable Devices using Reinforcement Learning |