
Robust Dimension Reduction for Clustering With Local Adaptive Learning

In pattern recognition and data mining, clustering is a classical technique for grouping items of interest and has been widely employed in numerous applications. Among various clustering algorithms, K-means (KM) clustering is the most popular for its simplicity and efficiency. However, with the rapid development of social networks, high-dimensional data are frequently generated, which poses a considerable challenge to traditional KM clustering in the form of the curse of dimensionality. In such scenarios, it is difficult to directly cluster high-dimensional data, which often contain redundant features and noise. Although existing approaches try to solve this problem by combining subspace learning with KM clustering, they still have the following limitations: 1) the discriminative information in the low-dimensional subspace is not well captured; 2) the intrinsic geometric information is seldom considered; and 3) the procedure for optimizing the discrete cluster indicator matrix is vulnerable to noise. In this paper, we propose a novel clustering model to cope with the above challenges. Within the proposed model, discriminative information is adaptively explored by unifying local adaptive subspace learning and KM clustering. We extend the proposed model with a robust ℓ2,1-norm loss function, under which the robust cluster centroid can be calculated in a weighted iterative procedure. We also explore and discuss the relationships between the proposed algorithm and several related studies. Extensive experiments on a variety of benchmark data sets demonstrate the advantage of the proposed model over state-of-the-art clustering approaches.
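The "weighted iterative procedure" the abstract mentions for the robust ℓ2,1-norm centroid can be illustrated with a generic iteratively reweighted (Weiszfeld-style) update: minimizing the unsquared residual norms downweights outliers, unlike the ordinary mean. This is a minimal sketch under that standard interpretation, not the paper's exact algorithm; `robust_centroid` and its parameters are illustrative names.

```python
import numpy as np

def robust_centroid(X, n_iter=50, eps=1e-8):
    """Estimate the centroid minimizing sum_i ||x_i - c||_2 (an
    l2,1-style loss over one cluster's points) by iterative
    reweighting. Generic sketch, not the paper's exact update."""
    c = X.mean(axis=0)                      # start from the ordinary (squared-loss) mean
    for _ in range(n_iter):
        d = np.linalg.norm(X - c, axis=1)   # per-point residual norms
        w = 1.0 / np.maximum(d, eps)        # large residual -> small weight
        c_new = (w[:, None] * X).sum(axis=0) / w.sum()
        if np.allclose(c_new, c):           # stop once the update stabilizes
            break
        c = c_new
    return c
```

On a toy cluster with one far-away outlier, the reweighted centroid stays near the bulk of the points while the plain mean is dragged toward the outlier, which is the robustness property the abstract claims for the ℓ2,1 loss.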

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2019-03, Vol. 30 (3), p. 657-669
Main Authors: Wang, Xiao-Dong; Chen, Rung-Ching; Zeng, Zhi-Qiang; Hong, Chao-Qun; Yan, Fei
Format: Article
Language: English
DOI: 10.1109/TNNLS.2018.2850823
ISSN: 2162-237X
EISSN: 2162-2388
Source: IEEE Xplore (Online service)
Subjects: ℓ2,1-norm
Adaptation models
Adaptive learning
Algorithms
Clustering
Clustering algorithms
Data mining
Data processing
Dimension reduction
Dimensionality reduction
Iterative methods
K-means
Learning
Linear programming
manifold learning
Manifolds
Pattern recognition
Robustness
Social networks
Social organization
State of the art
Subspaces