
TAPCA: time adaptive self-organizing maps for adaptive principal components analysis

We propose a neural network called time adaptive principal components analysis (TAPCA) which is composed of a number of time adaptive self-organizing map (TASOM) networks. Each TASOM in the TAPCA network estimates one eigenvector of the correlation matrix of the input vectors entered so far, without having to calculate the correlation matrix. This estimation is done in an online fashion. The input distribution can be nonstationary, too. The eigenvectors appear in order of importance: the first TASOM calculates the eigenvector corresponding to the largest eigenvalue of the correlation matrix, and so on. The TAPCA network is tested in stationary environments, and is compared with the eigendecomposition (ED) method and generalized Hebbian algorithm (GHA) network. It performs better than both methods and needs fewer samples to converge. It is also tested in nonstationary environments, where it automatically tolerates translation, rotation, scaling, and a change in the shape of the distribution.

Bibliographic Details
Main Authors: Shah-Hosseini, H., Safabakhsh, R.
Format: Conference Proceeding
Language: English
Subjects: Adaptive systems; Computer networks; Eigenvalues and eigenfunctions; Neural networks; Neurons; Optical computing; Principal component analysis; Self organizing feature maps; Shape; Testing
Online Access: Request full text
container_end_page 512 vol.1
container_start_page 509
container_title Proceedings 2001 International Conference on Image Processing (Cat. No.01CH37205)
container_volume 1
creator Shah-Hosseini, H.
Safabakhsh, R.
description We propose a neural network called time adaptive principal components analysis (TAPCA) which is composed of a number of time adaptive self-organizing map (TASOM) networks. Each TASOM in the TAPCA network estimates one eigenvector of the correlation matrix of the input vectors entered so far, without having to calculate the correlation matrix. This estimation is done in an online fashion. The input distribution can be nonstationary, too. The eigenvectors appear in order of importance: the first TASOM calculates the eigenvector corresponding to the largest eigenvalue of the correlation matrix, and so on. The TAPCA network is tested in stationary environments, and is compared with the eigendecomposition (ED) method and generalized Hebbian algorithm (GHA) network. It performs better than both methods and needs fewer samples to converge. It is also tested in nonstationary environments, where it automatically tolerates translation, rotation, scaling, and a change in the shape of the distribution.
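Note: the abstract above names the generalized Hebbian algorithm (GHA) as one of the comparison methods. For orientation only, the following minimal NumPy sketch shows GHA-style (Sanger's rule) online eigenvector estimation, which shares the structure the abstract describes — one unit per eigenvector, converging in order of eigenvalue, without ever forming the correlation matrix. It is not the authors' TASOM-based update; all names, dimensions, and learning-rate settings are illustrative assumptions.

```python
import numpy as np

def gha_update(W, x, eta):
    """One Sanger/GHA step: rows of W track the leading eigenvectors
    of the input correlation matrix E[x x^T], in order of eigenvalue."""
    y = W @ x                                   # component outputs
    # The lower-triangular term deflates each component against the ones
    # above it, which enforces the ordered (largest-eigenvalue-first) result.
    W += eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W

# Illustrative usage on synthetic correlated data (settings are arbitrary).
rng = np.random.default_rng(0)
d, k, eta = 8, 3, 0.01
A = rng.normal(size=(d, d))
W = rng.normal(scale=0.1, size=(k, d))          # initial component estimates
for _ in range(20000):
    x = A @ rng.normal(size=d)                  # sample with E[x x^T] = A A^T
    W = gha_update(W, x, eta)

# Sanity check against batch eigendecomposition of the true correlation matrix:
# absolute cosine similarities between learned rows and top eigenvectors -> ~1.
C = A @ A.T
eigvals, eigvecs = np.linalg.eigh(C)
top = eigvecs[:, ::-1][:, :k].T                 # leading eigenvectors, row-wise
Wn = W / np.linalg.norm(W, axis=1, keepdims=True)
print(np.abs(np.sum(Wn * top, axis=1)))
```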
doi_str_mv 10.1109/ICIP.2001.959065
format conference_proceeding
fulltext fulltext_linktorsrc
identifier ISBN: 0780367251; ISBN: 9780780367258
ispartof Proceedings 2001 International Conference on Image Processing (Cat. No.01CH37205), 2001, Vol.1, p.509-512 vol.1
language eng
recordid cdi_ieee_primary_959065
source IEEE Electronic Library (IEL) Conference Proceedings
subjects Adaptive systems
Computer networks
Eigenvalues and eigenfunctions
Neural networks
Neurons
Optical computing
Principal component analysis
Self organizing feature maps
Shape
Testing
title TAPCA: time adaptive self-organizing maps for adaptive principal components analysis