
New globally convergent training scheme based on the resilient propagation algorithm

In this paper, a new globally convergent modification of the Resilient Propagation (Rprop) algorithm is presented. This new addition to the Rprop family of methods builds on a mathematical framework for convergence analysis that ensures the adaptive local learning rates of Rprop's schedule generate a descent search direction at each iteration. Simulation results on six problems from the PROBEN1 benchmark collection show that the globally convergent modification of Rprop exhibits improved learning speed and compares favorably against the original Rprop and the Improved Rprop (IRprop), a recently proposed Rprop modification.
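To make the abstract's description concrete, the sketch below is a minimal Python illustration of the classic sign-based Rprop update with per-weight adaptive step sizes, plus a simple safeguard that falls back to a scaled negative-gradient step whenever the sign-based direction is not a descent direction (direction · gradient < 0). The function name `rprop_step`, the fallback rule, and the hyperparameter defaults are assumptions for illustration only; the paper's actual globally convergent scheme enforces the descent condition by adjusting the local learning rates themselves, not by this fallback.

```python
import numpy as np

def rprop_step(w, grad, prev_grad, step,
               eta_plus=1.2, eta_minus=0.5,
               step_min=1e-6, step_max=50.0):
    """One sign-based Rprop update with per-weight adaptive step sizes.

    Illustrative sketch only: the descent-direction safeguard below is a
    stand-in for the paper's globally convergent modification, which
    instead adapts one of the local learning rates so that the descent
    condition holds at every iteration.
    """
    sign_change = grad * prev_grad

    # Grow the step where the gradient kept its sign, shrink it where it flipped.
    step = np.where(sign_change > 0, np.minimum(step * eta_plus, step_max), step)
    step = np.where(sign_change < 0, np.maximum(step * eta_minus, step_min), step)

    # Where the sign flipped, skip that component's update this iteration.
    effective_grad = np.where(sign_change < 0, 0.0, grad)

    # Sign-based search direction: magnitude comes from the step, not the gradient.
    direction = -np.sign(effective_grad) * step

    # Descent check: require direction . grad < 0; otherwise fall back to a
    # small negative-gradient step (hypothetical fallback, for illustration).
    if np.dot(direction, grad) >= 0.0:
        direction = -step_min * grad

    w_new = w + direction
    return w_new, step, grad.copy()

# Usage sketch: minimize a simple quadratic f(w) = 0.5 * ||w||^2.
w = np.array([2.0, -3.0])
prev_grad = np.zeros_like(w)
step = np.full_like(w, 0.1)
for _ in range(50):
    grad = w                      # gradient of the quadratic
    w, step, prev_grad = rprop_step(w, grad, prev_grad, step)
```

Because the update depends only on the sign of each partial derivative, the per-weight step sizes rather than the gradient magnitudes control learning speed, which is what the adaptive schedule described in the abstract refers to.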


Bibliographic Details
Published in: Neurocomputing (Amsterdam), 2005-03, Vol. 64, p. 253-270
Main Authors: Anastasiadis, Aristoklis D., Magoulas, George D., Vrahatis, Michael N.
Format: Article
Language:English
Subjects: Batch learning; Convergence analysis; First-order training algorithms; Global convergence property; IRprop; Rprop; Supervised learning
ISSN: 0925-2312
EISSN: 1872-8286
DOI: 10.1016/j.neucom.2004.11.016
Publisher: Elsevier B.V.