Tiny Machine Learning for Concept Drift
Tiny machine learning (TML) is a new research area whose goal is to design machine and deep learning (DL) techniques able to operate in embedded systems and Internet-of-Things (IoT) units, hence satisfying the severe technological constraints on memory, computation, and energy characterizing these pervasive devices. Interestingly, the related literature has mainly focused on reducing the computational and memory demand of the inference phase of machine and deep learning models, while the training is typically assumed to be carried out in cloud or edge computing systems (due to the larger memory and computational requirements). This assumption results in TML solutions that might become obsolete when the process generating the data is affected by concept drift (e.g., due to periodicity or seasonality effects, faults or malfunctioning affecting sensors or actuators, or changes in the users' behavior), a common situation in real-world application scenarios. For the first time in the literature, this article introduces a TML for concept drift (TML-CD) solution based on deep learning feature extractors and a k-nearest neighbors (k-NN) classifier integrating a hybrid adaptation module able to deal with concept drift affecting the data-generating process. This adaptation module continuously updates (in a passive way) the knowledge base of TML-CD and, at the same time, employs a change detection test (CDT) to inspect for changes (in an active way) so as to quickly adapt to concept drift by removing obsolete knowledge. Experimental results on both image and audio benchmarks show the effectiveness of the proposed solution, whilst the porting of TML-CD onto three off-the-shelf microcontroller units (MCUs) shows the feasibility of what is proposed in real-world pervasive systems.
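The abstract describes the TML-CD mechanism only at a high level. The sketch below illustrates the general idea of a bounded k-NN knowledge base over deep features with a hybrid adaptation module: a passive path that keeps absorbing new supervised samples under a fixed memory budget, and an active path that uses a change detection test to prune obsolete samples when drift is flagged. It is a minimal sketch under assumed choices (the class name TinyKNNWithDrift, a CUSUM-style test on the error stream as the CDT, FIFO eviction, and all numeric thresholds are illustrative), not the authors' actual algorithm.

```python
import numpy as np


class TinyKNNWithDrift:
    """Minimal sketch of a k-NN knowledge base with hybrid adaptation.

    Passive path: every supervised sample is appended to a bounded FIFO
    knowledge base. Active path: a CUSUM-style change detection test on
    the 0/1 error stream flags drift and prunes older (obsolete) samples.
    Class name, test, and thresholds are illustrative assumptions.
    """

    def __init__(self, k=5, max_samples=500, target_error=0.1,
                 slack=0.02, threshold=3.0):
        self.k = k
        self.max_samples = max_samples    # memory budget (MCU constraint)
        self.target_error = target_error  # expected in-control error rate
        self.slack = slack                # CUSUM slack term
        self.threshold = threshold        # CUSUM detection threshold
        self.features, self.labels = [], []
        self.cusum = 0.0

    def predict(self, x):
        """Majority vote among the k nearest stored feature vectors."""
        if not self.features:
            return None
        dists = np.linalg.norm(np.asarray(self.features) - np.asarray(x), axis=1)
        nearest = np.argsort(dists)[: self.k]
        votes = [self.labels[i] for i in nearest]
        return max(set(votes), key=votes.count)

    def update(self, x, y):
        """Process one supervised sample; return True if drift was flagged."""
        y_hat = self.predict(x)
        error = float(y_hat is not None and y_hat != y)

        # Passive adaptation: always absorb the new sample, FIFO eviction.
        self.features.append(np.asarray(x, dtype=np.float32))
        self.labels.append(y)
        if len(self.features) > self.max_samples:
            self.features.pop(0)
            self.labels.pop(0)

        # Active adaptation: accumulate excess error; on detection, drop
        # the older half of the knowledge base (assumed obsolete concept).
        self.cusum = max(0.0, self.cusum + error - self.target_error - self.slack)
        if self.cusum > self.threshold:
            keep = max(1, len(self.features) // 2)
            self.features = self.features[-keep:]
            self.labels = self.labels[-keep:]
            self.cusum = 0.0
            return True
        return False
```

Under these assumptions, each incoming feature vector (e.g., the output of a frozen convolutional feature extractor) would be passed to update() together with its label; a True return value signals that the test flagged a drift and the knowledge base was pruned.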
Published in: | IEEE Transactions on Neural Networks and Learning Systems, 2024-06, Vol. 35 (6), p. 8470-8481 |
---|---|
Main Authors: | Disabato, Simone; Roveri, Manuel |
Format: | Article |
Language: | English |
Subjects: | Actuators; Adaptation; Benchmarks; Change detection; Computer applications; Computer memory; Concept drift; Concept learning; Deep learning (DL); Drift; Edge computing; Embedded systems; Feature extraction; Internet of Things; k-nearest neighbor (k-NN); Knowledge bases (artificial intelligence); Learning algorithms; Learning systems; Machine learning; Memory management; Modules; Obsolescence; Periodicity; Seasonal variations; Tiny machine learning (TML); Training |
DOI: | 10.1109/TNNLS.2022.3229897 |
ISSN: | 2162-237X |
EISSN: | 2162-2388 |
PMID: | 37015671 |
CODEN: | ITNNAL |
Publisher: | IEEE, United States |
Source: | IEEE Electronic Library (IEL) Journals |