
Robust noise-aware algorithm for randomized neural network and its convergence properties

Bibliographic Details
Published in: Neural networks, 2024-05, Vol. 173, p. 106202, Article 106202
Main Authors: Xiao, Yuqi; Adegoke, Muideen; Leung, Chi-Sing; Leung, Kwok Wa
Format: Article
Language: English
description Randomized neural networks (RNNs), such as the random vector functional link (RVFL) network and the extreme learning machine (ELM), are widely accepted and efficient methods for constructing single-hidden-layer feedforward networks (SLFNs). Owing to their strong approximation capabilities, RNNs are used extensively across many fields. However, their performance can degrade unpredictably under imperfect conditions, such as weight noise and outliers, so more reliable and robust RNN algorithms are needed. To address this issue, this paper proposes a new objective function that handles the combined effect of weight noise and training-data outliers in RVFL networks. Based on the half-quadratic optimization method, we then propose a novel algorithm, named the noise-aware RNN (NARNN), to optimize the proposed objective function. The convergence of the NARNN is also established theoretically. We further discuss how to apply the NARNN to ensemble deep RVFL (edRVFL) networks. Finally, we present an extension of the NARNN that concurrently handles weight noise, stuck-at faults, and outliers. Experimental results demonstrate that the proposed algorithm outperforms a number of state-of-the-art robust RNN algorithms.
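For orientation, the general setup the abstract describes can be sketched in a few lines. This is not the paper's NARNN algorithm (which additionally models weight noise and is derived via a specific half-quadratic objective); it is a minimal, hypothetical illustration of the two ingredients named in the abstract: an RVFL network (random, untrained hidden weights plus direct input links, with only the output weights solved in closed form) and half-quadratic-style robustness to outliers via iteratively reweighted least squares with a Welsch-type weight. All function names, the choice of sigmoid activation, and the parameters `lam` and `sigma` are illustrative assumptions.

```python
import numpy as np

def rvfl_features(X, W, b):
    """Hidden features: random projection + sigmoid, concatenated with the
    direct input links that distinguish an RVFL network from a plain ELM."""
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.hstack([X, H])

def fit_ridge(A, y, lam):
    """Closed-form ridge solution for the output weights (ordinary RVFL/ELM)."""
    d = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(d), A.T @ y)

def fit_robust(A, y, lam, sigma=0.5, iters=20):
    """Half-quadratic-flavoured IRLS: alternate between fixing per-sample
    weights (Welsch influence, so large residuals count for little) and
    re-solving the resulting weighted ridge problem."""
    beta = fit_ridge(A, y, lam)
    for _ in range(iters):
        r = y - A @ beta
        w = np.exp(-(r ** 2) / (2.0 * sigma ** 2))   # outliers -> weight ~ 0
        Aw = A * w[:, None]
        beta = np.linalg.solve(A.T @ Aw + lam * np.eye(A.shape[1]), Aw.T @ y)
    return beta

# Toy regression task with injected gross outliers.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(200)
y[:10] += 5.0                                        # 5% gross outliers

W = rng.standard_normal((2, 50))                     # random, never trained
b = rng.standard_normal(50)
A = rvfl_features(X, W, b)

beta_ls = fit_ridge(A, y, lam=1e-2)
beta_ro = fit_robust(A, y, lam=1e-2)
clean = slice(10, None)
err_ls = np.mean((A[clean] @ beta_ls - y[clean]) ** 2)
err_ro = np.mean((A[clean] @ beta_ro - y[clean]) ** 2)
print(f"plain ridge MSE on clean samples: {err_ls:.4f}")
print(f"robust IRLS MSE on clean samples: {err_ro:.4f}")
```

On the clean portion of the data the reweighted fit should be noticeably less distorted by the injected outliers than the plain least-squares fit, which is the qualitative behaviour the paper's robust objective targets.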
doi 10.1016/j.neunet.2024.106202
pmid 38422835
issn 0893-6080
eissn 1879-2782
source ScienceDirect Freedom Collection 2022-2024
subjects Half-quadratic
Network resilience
Noise awareness
Outlier samples
Randomized neural network