The common vector approach and its relation to principal component analysis
Published in: IEEE Transactions on Speech and Audio Processing, 2001-09, Vol. 9 (6), pp. 655-662
Main Authors: Gulmezoglu, M.B.; Dzhafarov, V.; Barkana, A.
Format: Article
Language: English
container_end_page | 662 |
container_issue | 6 |
container_start_page | 655 |
container_title | IEEE transactions on speech and audio processing |
container_volume | 9 |
creator | Gulmezoglu, M.B. Dzhafarov, V. Barkana, A. |
description | The main point of the paper is to show the close relation between the nonzero principal components and the difference subspace together with the complementary close relation between the zero principal components and the common vector. A common vector representing each word-class is obtained from the eigenvectors of the covariance matrix of its own word-class; that is, the common vector is in the direction of a linear combination of the eigenvectors corresponding to the zero eigenvalues of the covariance matrix. The methods that use the nonzero principal components for recognition purposes suggest the elimination of all the features that are in the direction of the eigenvectors corresponding to the smallest eigenvalues (including the zero eigenvalues) of the covariance matrix whereas the common vector approach suggests the elimination of all the features that are in the direction of the eigenvectors corresponding to the largest, all nonzero eigenvalues of the covariance matrix. |
doi_str_mv | 10.1109/89.943343 |
format | article |
identifier | ISSN: 1063-6676 |
ispartof | IEEE transactions on speech and audio processing, 2001-09, Vol.9 (6), p.655-662 |
issn | 1063-6676 2329-9290 1558-2353 2329-9304 |
language | eng |
source | IEEE Electronic Library (IEL) Journals |
subjects | Applied sciences; Covariance matrix; Eigenvalues; Eigenvalues and eigenfunctions; Eigenvectors; Equations; Exact sciences and technology; Feature recognition; Information, signal and communications theory; Loudspeakers; Mathematical analysis; Mathematics; Principal component analysis; Principal components analysis; Recognition; Signal processing; Speech; Speech processing; Speech recognition; Studies; Telecommunications and information theory; Testing; Two dimensional displays; Vectors; Vectors (mathematics) |
title | The common vector approach and its relation to principal component analysis |
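The eigenstructure argument in the abstract can be sketched numerically. The snippet below is a minimal illustration under assumed toy data, not the authors' implementation: a class of m feature vectors in d dimensions with m < d, so the class covariance matrix is rank-deficient and has zero eigenvalues, and the projection of any sample onto the zero-eigenvalue eigenvectors is the same for every sample of the class.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 10, 4                     # dimension > sample count, so C is rank-deficient
X = rng.normal(size=(m, d))      # rows: the m feature vectors of one word-class

mu = X.mean(axis=0)
C = (X - mu).T @ (X - mu) / m    # d x d class covariance matrix, rank <= m - 1

w, V = np.linalg.eigh(C)         # eigenvalues in ascending order
null = V[:, w < 1e-10]           # eigenvectors of the zero eigenvalues

# Every centered sample lies in the span of the nonzero-eigenvalue eigenvectors,
# so projecting any sample onto the zero-eigenvalue eigenvectors yields one
# identical vector for the whole class: the common vector.
commons = X @ null @ null.T      # each row: one sample's null-space projection
assert np.allclose(commons, commons[0])
common_vector = commons[0]
```

This makes the abstract's contrast concrete: PCA-style recognition keeps the components along the large-eigenvalue eigenvectors (where `commons` discards the per-sample variation), whereas the common vector approach keeps only the zero-eigenvalue directions, where all samples of the class coincide.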