Fast training of multilayer perceptrons with a mixed norm algorithm
Main Authors: Abid, S.; Fnaiech, F.; Jervis, B.W.; Cheriet, M.
Format: Conference Proceeding
Language: English
Subjects: Artificial intelligence; Back; Backpropagation algorithms; Biomedical imaging; Convergence; Laboratories; Least squares approximation; Least squares methods; Multilayer perceptrons; Neurons
Online Access: Request full text
container_end_page | 1022 |
container_start_page | 1018 |
container_title | Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005 |
container_volume | 2 |
creator | Abid, S.; Fnaiech, F.; Jervis, B.W.; Cheriet, M. |
description | A new fast training algorithm for the multilayer perceptron (MLP) is proposed. This new algorithm is based on the optimization of a mixed least square (LS) and least fourth (LF) criterion, producing a modified form of the standard back propagation algorithm (SBP). To determine the updating rules in the hidden layers, a back propagation strategy analogous to that used in conventional learning algorithms is developed. This permits the application of the learning procedure to all the layers. Experimental results on benchmark applications and a real medical problem indicate a significant reduction in the total number of iterations and in the convergence time, together with improved generalization capacity, compared with the SBP algorithm. |
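The mixed criterion described above can be illustrated with a minimal sketch. This is not the authors' implementation: the network size, learning rate, and mixing weight `lam` are illustrative assumptions. The only point it demonstrates is that a loss of the form J = Σ(λ·e² + (1−λ)·e⁴) changes only the output-layer error term (the LF part contributes a gradient of 4e³ alongside the LS part's 2e), while the hidden layers are updated by ordinary back propagation:

```python
import numpy as np

# Sketch (assumed parameters, not the paper's code) of MLP training with a
# mixed least-square (LS) + least-fourth (LF) criterion:
#   J = sum( lam * e^2 + (1 - lam) * e^4 ),   e = target - output
# dJ/dY = -(2*lam*e + 4*(1 - lam)*e**3); hidden layers use standard backprop.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy benchmark: one-hidden-layer MLP learning XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

eta, lam = 0.5, 0.5  # learning rate and LS/LF mixing weight (assumptions)

for epoch in range(30000):
    # Forward pass.
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)
    e = T - Y

    # Output-layer delta from the mixed-norm derivative, times sigmoid'.
    dY = -(2 * lam * e + 4 * (1 - lam) * e**3) * Y * (1 - Y)
    # Back-propagate to the hidden layer as in the standard algorithm.
    dH = (dY @ W2.T) * H * (1 - H)

    W2 -= eta * H.T @ dY; b2 -= eta * dY.sum(axis=0)
    W1 -= eta * X.T @ dH; b1 -= eta * dH.sum(axis=0)

print(np.round(Y.ravel(), 2))  # outputs should approach [0, 1, 1, 0]
```

Setting `lam = 1` recovers the plain LS cost of standard back propagation, so the sketch also shows how the mixed norm degenerates to the SBP baseline.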
doi_str_mv | 10.1109/IJCNN.2005.1555992 |
format | conference_proceeding |
identifier | ISSN: 2161-4393; EISSN: 2161-4407; ISBN: 0780390482; ISBN: 9780780390485 |
ispartof | Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005, Vol. 2, pp. 1018-1022 |
issn | 2161-4393; 2161-4407 (electronic) |
language | eng |
recordid | cdi_ieee_primary_1555992 |
source | IEEE Xplore All Conference Series |
subjects | Artificial intelligence; Back; Backpropagation algorithms; Biomedical imaging; Convergence; Laboratories; Least squares approximation; Least squares methods; Multilayer perceptrons; Neurons |
title | Fast training of multilayer perceptrons with a mixed norm algorithm |