Unsupervised domain adaptation by incremental learning for concept drifting data streams
Incremental learning is a learning paradigm in which a model is updated continuously as new data become available; its main challenge is adapting to non-stationary environments without a time-consuming re-training process. Many efforts have been made on incremental supervised learning; however, providing sufficient labeled data remains a major problem. Recently, domain adaptation methods have gained attention. These methods aim to leverage knowledge from an auxiliary source domain to boost model performance in the target domain by reducing the domain discrepancy between them. To address these issues, the present paper proposes a model that incrementally learns a new domain characterized by drift arising from a non-stationary environment. It uses unsupervised, fuzzy-based domain adaptation to classify data streams subject to concept drift while accounting for a label-agnostic incremental setting in the target domain. Incremental learning updates occur whenever an entropy-based metric indicates uncertainty, ensuring that informative samples are integrated. Outdated samples are forgotten during training via a dynamic sample weighting strategy. Experiments on forty-five tasks demonstrate the proposed model's superiority in dynamic adaptation on non-stationary domains, with improvements in accuracy and computational efficiency.
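The record describes an incremental learner that updates only when an entropy-based uncertainty metric fires and that forgets outdated samples through dynamic weighting. The paper's actual algorithm is not given in this record, so the following is a minimal, hypothetical sketch of that control flow; `threshold`, `decay`, and the class name are illustrative, not the authors' API.

```python
import math

def entropy(probs):
    """Shannon entropy (in nats) of a predictive distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0.0)

class IncrementalStreamLearner:
    """Hedged sketch: entropy-gated updates with decayed sample weights."""

    def __init__(self, threshold=0.5, decay=0.9, min_weight=1e-3):
        self.threshold = threshold    # entropy level that triggers an update
        self.decay = decay            # multiplicative forgetting factor
        self.min_weight = min_weight  # weights below this are dropped
        self.buffer = []              # list of (sample, weight) pairs

    def observe(self, x, probs):
        """Return True if this sample triggered a (hypothetical) update."""
        if entropy(probs) < self.threshold:
            return False  # confident prediction: skip the costly update
        # Age existing samples, then forget those that decayed away.
        self.buffer = [(s, w * self.decay) for s, w in self.buffer
                       if w * self.decay > self.min_weight]
        self.buffer.append((x, 1.0))  # newest sample gets full weight
        return True
```

In a real system the `return True` branch would retrain or fine-tune the model on the weighted buffer; the entropy gate simply keeps cheap, confident predictions from triggering that cost.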
Published in: | International journal of machine learning and cybernetics 2024-09, Vol.15 (9), p.4055-4078 |
---|---|
Main Authors: | Moradi, Mona; Rahmanimanesh, Mohammad; Shahzadi, Ali |
Format: | Article |
Language: | English |
Subjects: | Adaptation; Artificial Intelligence; Complex Systems; Computational Intelligence; Control; Data transmission; Decision making; Drift; Engineering; Knowledge; Learning; Mechatronics; Methods; Nonstationary environments; Pattern Recognition; Robotics; Supervised learning; Systems Biology |
Citations: | Items that this one cites |
Online Access: | Get full text |
---|---|
container_end_page | 4078 |
container_issue | 9 |
container_start_page | 4055 |
container_title | International journal of machine learning and cybernetics |
container_volume | 15 |
creator | Moradi, Mona; Rahmanimanesh, Mohammad; Shahzadi, Ali
description | Incremental learning is a learning paradigm in which a model is updated continuously as new data become available; its main challenge is adapting to non-stationary environments without a time-consuming re-training process. Many efforts have been made on incremental supervised learning; however, providing sufficient labeled data remains a major problem. Recently, domain adaptation methods have gained attention. These methods aim to leverage knowledge from an auxiliary source domain to boost model performance in the target domain by reducing the domain discrepancy between them. To address these issues, the present paper proposes a model that incrementally learns a new domain characterized by drift arising from a non-stationary environment. It uses unsupervised, fuzzy-based domain adaptation to classify data streams subject to concept drift while accounting for a label-agnostic incremental setting in the target domain. Incremental learning updates occur whenever an entropy-based metric indicates uncertainty, ensuring that informative samples are integrated. Outdated samples are forgotten during training via a dynamic sample weighting strategy. Experiments on forty-five tasks demonstrate the proposed model's superiority in dynamic adaptation on non-stationary domains, with improvements in accuracy and computational efficiency.
doi_str_mv | 10.1007/s13042-024-02135-1 |
format | article |
publisher | Springer Berlin Heidelberg (Berlin/Heidelberg)
abbreviated_title | Int. J. Mach. Learn. & Cyber
eissn | 1868-808X
rights | The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature 2024
orcidid | https://orcid.org/0000-0001-7881-6972
fulltext | fulltext |
identifier | ISSN: 1868-8071 |
ispartof | International journal of machine learning and cybernetics, 2024-09, Vol.15 (9), p.4055-4078 |
issn | 1868-8071; 1868-808X
language | eng |
recordid | cdi_proquest_journals_3093952166 |
source | Springer Link |
subjects | Adaptation; Artificial Intelligence; Complex Systems; Computational Intelligence; Control; Data transmission; Decision making; Drift; Engineering; Knowledge; Learning; Mechatronics; Methods; Nonstationary environments; Original Article; Pattern Recognition; Robotics; Supervised learning; Systems Biology
title | Unsupervised domain adaptation by incremental learning for concept drifting data streams |
url | http://sfxeu10.hosted.exlibrisgroup.com/loughborough?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-14T23%3A31%3A46IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Unsupervised%20domain%20adaptation%20by%20incremental%20learning%20for%20concept%20drifting%20data%20streams&rft.jtitle=International%20journal%20of%20machine%20learning%20and%20cybernetics&rft.au=Moradi,%20Mona&rft.date=2024-09-01&rft.volume=15&rft.issue=9&rft.spage=4055&rft.epage=4078&rft.pages=4055-4078&rft.issn=1868-8071&rft.eissn=1868-808X&rft_id=info:doi/10.1007/s13042-024-02135-1&rft_dat=%3Cproquest_cross%3E3093952166%3C/proquest_cross%3E%3Cgrp_id%3Ecdi_FETCH-LOGICAL-c270t-7f9003f4c1fcbe4930690b032763d25599fb4ef5da1b01946612353de07e30b33%3C/grp_id%3E%3Coa%3E%3C/oa%3E%3Curl%3E%3C/url%3E&rft_id=info:oai/&rft_pqid=3093952166&rft_id=info:pmid/&rfr_iscdi=true |
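The abstract in this record says the method boosts target-domain performance "by reducing the domain discrepancy" between source and target. The record does not name the discrepancy measure the authors use, so as a generic illustration here is a linear-kernel maximum mean discrepancy (a common choice in domain adaptation, not necessarily the paper's fuzzy-based criterion):

```python
def mmd_linear(source, target):
    """Squared MMD with a linear kernel: ||mean(source) - mean(target)||^2.

    `source` and `target` are lists of equal-length feature vectors.
    Purely illustrative; the paper's actual discrepancy measure may differ.
    """
    def mean_vec(rows):
        n = len(rows)
        return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

    mu_s, mu_t = mean_vec(source), mean_vec(target)
    return sum((a - b) ** 2 for a, b in zip(mu_s, mu_t))
```

A domain-adaptation model would typically add this quantity (or a kernelized variant) as a penalty term in its training loss, pushing source and target features toward a shared representation.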