Perlustration of error surfaces for nonlinear stochastic gradient descent algorithms
We attempt to explain in more detail the performance of several novel algorithms for nonlinear neural adaptive filtering. Weight trajectories together with the error surface give a clear understandable representation of the family of least mean square (LMS) based, nonlinear gradient descent (NGD), search-then-converge (STC) learning algorithms and the real-time recurrent learning (RTRL) algorithm. Performance is measured on prediction of coloured and nonlinear input. The results are an alternative qualitative representation of different qualitative performance measures for the analysed algorithms. Error surfaces and the adjacent instantaneous prediction errors support the analysis.
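The record does not reproduce the algorithms' equations, but the family it names is standard. As a rough, hypothetical sketch (the signal model, step size `mu`, and tap count are illustrative assumptions, not taken from the paper), a nonlinear gradient descent update for a single tanh neuron predicting a coloured AR(1) input might look like:

```python
import numpy as np

# Illustrative sketch only: NGD for a single tanh neuron predicting a
# coloured AR(1) signal. Signal model and step size mu are assumptions.
rng = np.random.default_rng(0)

# coloured input: x[n] = 0.9 x[n-1] + 0.1 v[n], v white Gaussian
N = 2000
x = np.zeros(N)
for n in range(1, N):
    x[n] = 0.9 * x[n - 1] + 0.1 * rng.normal()

w = np.zeros(2)      # two-tap weight vector
mu = 0.3             # fixed step size (an STC scheme would anneal this)
errors = []
for n in range(2, N):
    u = np.array([x[n - 1], x[n - 2]])   # input taps
    y = np.tanh(w @ u)                   # nonlinear filter output
    e = x[n] - y                         # instantaneous prediction error
    # gradient of e^2/2 w.r.t. w, using tanh'(s) = 1 - tanh(s)^2
    w += mu * e * (1.0 - y ** 2) * u
    errors.append(e)
```

Recording `w` at each step gives the weight trajectory that, per the abstract, is superimposed on the error surface; the `errors` list is the adjacent instantaneous prediction error.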
Main Authors: | Hanna, A.I.; Krcmar, I.R.; Mandic, D.P.
---|---
Format: | Conference Proceeding
Language: | English
Subjects: | Algorithm design and analysis; Backpropagation algorithms; Filters; Information systems; Least squares approximation; Monte Carlo methods; Performance analysis; Signal processing algorithms; Stochastic processes; Visualization
Online Access: | Request full text
container_start_page | 11 |
---|---|
container_end_page | 16 |
creator | Hanna, A.I.; Krcmar, I.R.; Mandic, D.P. |
description | We attempt to explain in more detail the performance of several novel algorithms for nonlinear neural adaptive filtering. Weight trajectories together with the error surface give a clear understandable representation of the family of least mean square (LMS) based, nonlinear gradient descent (NGD), search-then-converge (STC) learning algorithms and the real-time recurrent learning (RTRL) algorithm. Performance is measured on prediction of coloured and nonlinear input. The results are an alternative qualitative representation of different qualitative performance measures for the analysed algorithms. Error surfaces and the adjacent instantaneous prediction errors support the analysis. |
doi_str_mv | 10.1109/NEUREL.2002.1057958 |
format | conference_proceeding |
fulltext | fulltext_linktorsrc |
identifier | ISBN: 0780375939; ISBN: 9780780375932 |
ispartof | 6th Seminar on Neural Network Applications in Electrical Engineering, 2002, p.11-16 |
language | eng |
recordid | cdi_ieee_primary_1057958 |
source | IEEE Electronic Library (IEL) Conference Proceedings |
subjects | Algorithm design and analysis; Backpropagation algorithms; Filters; Information systems; Least squares approximation; Monte Carlo methods; Performance analysis; Signal processing algorithms; Stochastic processes; Visualization |
title | Perlustration of error surfaces for nonlinear stochastic gradient descent algorithms |
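The abstract's central device is plotting weight trajectories on top of the error surface. A minimal sketch of how such a surface could be computed for a two-tap tanh filter (the grid, signal model, and filter are illustrative assumptions, not taken from the paper):

```python
import numpy as np

# Illustrative sketch only: evaluate the mean squared prediction error of
# a two-tap tanh filter over a grid of weights -- the "error surface" that
# weight trajectories would be superimposed on. Signal model is assumed.
rng = np.random.default_rng(1)

# coloured input: AR(1) process x[n] = 0.9 x[n-1] + 0.1 v[n]
N = 1000
x = np.zeros(N)
for n in range(1, N):
    x[n] = 0.9 * x[n - 1] + 0.1 * rng.normal()

U = np.column_stack([x[1:-1], x[:-2]])   # taps [x[n-1], x[n-2]]
d = x[2:]                                # desired signal x[n]

grid = np.linspace(-1.0, 2.0, 61)
mse = np.array([[np.mean((d - np.tanh(U @ np.array([a, b]))) ** 2)
                 for a in grid] for b in grid])

# the surface minimum should sit near the AR(1) solution w ~ (0.9, 0)
j_b, j_a = np.unravel_index(np.argmin(mse), mse.shape)
print(grid[j_a], grid[j_b])
```

Contour-plotting `mse` with the weight trajectory from an LMS/NGD run overlaid (e.g. via matplotlib's `contour`) reproduces the kind of visualization the abstract describes.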