Winner Takes All: A Superpixel Aided Voting Algorithm for Training Unsupervised PolSAR CNN Classifiers
Published in: | IEEE transactions on geoscience and remote sensing, 2022, Vol.60, p.1-19 |
---|---|
Main Authors: | Zuo, Yixin; Guo, Jiayi; Zhang, Yueting; Hu, Yuxin; Lei, Bin; Qiu, Xiaolan; Ding, Chibiao |
Format: | Article |
Language: | English |
container_end_page | 19 |
container_issue | |
container_start_page | 1 |
container_title | IEEE transactions on geoscience and remote sensing |
container_volume | 60 |
creator | Zuo, Yixin Guo, Jiayi Zhang, Yueting Hu, Yuxin Lei, Bin Qiu, Xiaolan Ding, Chibiao |
description | Unsupervised methods play an essential role in polarimetric synthetic aperture radar (PolSAR) image classification, where labeled data are difficult to obtain. However, a large gap remains between existing unsupervised learning methods and supervised learning methods. Without the semantic constraints of labeled data, pixels within the same category are often misclassified into different categories, leaving the output messy. To address this issue, this article proposes a fully unsupervised pipeline for training convolutional neural networks (CNNs). The pipeline combines low-level superpixels and high-level CNN semantic features for high-quality pseudolabel generation. It effectively eliminates misclassified pixels by voting within each superpixel blob while preserving the sharpness of edges. As model training proceeds, the quality of the generated labels improves. Experiments on airborne [experimental airborne SAR system from Germany (ESAR)/airborne synthetic aperture radar from America (AIRSAR)] and spaceborne (RadarSat2) PolSAR images demonstrate the effectiveness of the proposed method [measured with overall accuracy (OA), average accuracy (AA), and Kappa metrics]. Our method outperforms previous unsupervised methods (H/alpha-Wishart, SM-Wishart, FDD-H, DEC, and VQC-CAE) by a large margin and even has performance comparable to a supervised CNN model [fully convolutional network (FCN)]. |
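The "winner takes all" voting step described in the abstract can be sketched roughly as follows. This is an illustrative reconstruction based only on the abstract, not the authors' code: it assumes a per-pixel label map from the CNN and a precomputed superpixel segmentation, and the function and variable names (`winner_takes_all`, `pred`, `segs`) are hypothetical.

```python
import numpy as np

def winner_takes_all(pred_labels, superpixels):
    """Replace each pixel's predicted class with the majority
    (winner-takes-all) vote of the superpixel it belongs to."""
    out = np.empty_like(pred_labels)
    for sp in np.unique(superpixels):
        mask = superpixels == sp              # pixels of this superpixel
        votes = np.bincount(pred_labels[mask])  # count class occurrences
        out[mask] = votes.argmax()            # winning class for the blob
    return out

# Toy example: a 4x4 label map with two superpixels (left/right halves)
# and a few "noisy" pixels that disagree with their neighbors.
pred = np.array([[0, 0, 1, 1],
                 [0, 1, 1, 1],
                 [2, 0, 1, 1],
                 [0, 0, 0, 1]])
segs = np.array([[0, 0, 1, 1],
                 [0, 0, 1, 1],
                 [0, 0, 1, 1],
                 [0, 0, 1, 1]])
clean = winner_takes_all(pred, segs)
# The stray 1s, 2, and 0 are voted away; each blob becomes uniform,
# and class boundaries follow superpixel edges, which is how the
# abstract's claim of preserved edge sharpness would be realized.
```

Because the vote is confined to each superpixel, label smoothing never crosses a superpixel boundary, which matches the abstract's claim that edges stay sharp while isolated misclassified pixels are removed.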
doi_str_mv | 10.1109/TGRS.2022.3177900 |
format | article |
fulltext | fulltext |
identifier | ISSN: 0196-2892 |
ispartof | IEEE transactions on geoscience and remote sensing, 2022, Vol.60, p.1-19 |
issn | 0196-2892 1558-0644 |
language | eng |
source | IEEE Electronic Library (IEL) Journals |
subjects | Accuracy Airborne radar Airborne remote sensing Algorithms Artificial neural networks Convolutional neural network (CNN) model Feature extraction Image classification Methods multiscale (MS) semantic feature Neural networks Pixels polarimetric synthetic aperture radar (PolSAR) image Radar Radar imaging SAR (radar) Scattering Semantics Submarine pipelines superpixel segmentation Supervised learning Synthetic aperture radar Teaching methods Training unsupervised classification Unsupervised learning Voting |
title | Winner Takes All: A Superpixel Aided Voting Algorithm for Training Unsupervised PolSAR CNN Classifiers |