Feature Intensification Using Perception-Guided Regional Classification for Remote Sensing Image Super-Resolution
In recent years, super-resolution technology has gained widespread attention in the field of remote sensing. Despite advancements, current methods often employ uniform reconstruction techniques across entire remote sensing images, neglecting the inherent variability in spatial frequency distributions, particularly the distinction between high-frequency texture regions and smoother areas. This leads to computational inefficiency, introducing redundant computations and failing to optimize the reconstruction process for regions of higher complexity. To address these issues, we propose the Perception-guided Classification Feature Intensification (PCFI) network. PCFI integrates two key components: a compressed sensing classifier that optimizes speed and performance, and a deep texture interaction fusion module that enhances content interaction and detail extraction. This network mitigates the tendency of Transformers to favor global information over local details, achieving improved image information integration through residual connections across windows. Furthermore, a classifier is employed to segment sub-image blocks prior to super-resolution, enabling efficient large-scale processing. The experimental results on the AID dataset indicate that PCFI achieves state-of-the-art performance, with a PSNR of 30.87 dB and an SSIM of 0.8131, while also delivering a 4.33% improvement in processing speed compared to the second-best method.
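The core idea in the abstract (classify sub-image blocks by spatial-frequency content before routing them to different reconstruction paths) can be illustrated with a small, self-contained sketch. The block size, threshold, and gradient-energy criterion below are illustrative assumptions, not details taken from the paper's classifier.

```python
# Hypothetical sketch of perception-guided regional classification:
# split an image into blocks and label each as high-frequency "texture"
# or "smooth", so a heavier SR branch could be reserved for texture blocks.
import numpy as np

def classify_blocks(image: np.ndarray, block: int = 32, thresh: float = 10.0):
    """Label each block 'texture' or 'smooth' by mean gradient energy,
    a cheap proxy for high-frequency content (thresh is illustrative)."""
    h, w = image.shape
    labels = {}
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            patch = image[y:y + block, x:x + block].astype(np.float64)
            gy, gx = np.gradient(patch)          # finite-difference gradients
            energy = np.mean(gy**2 + gx**2)      # mean squared gradient magnitude
            labels[(y, x)] = "texture" if energy > thresh else "smooth"
    return labels

# Synthetic demo: flat left half, noisy (high-frequency) right half.
rng = np.random.default_rng(0)
img = np.zeros((64, 64))
img[:, 32:] = rng.normal(0, 50, (64, 32))
labels = classify_blocks(img)
```

In a routing scheme like the one the abstract describes, only the blocks labeled `texture` would go through the expensive reconstruction path, which is where the claimed speed gain over uniform processing comes from.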
| Published in: | Remote sensing (Basel, Switzerland), 2024-11, Vol.16 (22), p.4201 |
|---|---|
| Main Authors: | Li, Yinghua; Xie, Jingyi; Chi, Kaichen; Zhang, Ying; Dong, Yunyun |
| Format: | Article |
| Language: | English |
| Subjects: | Accuracy; Classification; compressed sensing; Deep learning; Frequency dependence; Image reconstruction; Image resolution; Perception; Processing speed; Remote sensing; remote sensing images; Satellites; super-resolution; Texture |
creator | Li, Yinghua Xie, Jingyi Chi, Kaichen Zhang, Ying Dong, Yunyun |
description | In recent years, super-resolution technology has gained widespread attention in the field of remote sensing. Despite advancements, current methods often employ uniform reconstruction techniques across entire remote sensing images, neglecting the inherent variability in spatial frequency distributions, particularly the distinction between high-frequency texture regions and smoother areas, leading to computational inefficiency, which introduces redundant computations and fails to optimize the reconstruction process for regions of higher complexity. To address these issues, we propose the Perception-guided Classification Feature Intensification (PCFI) network. PCFI integrates two key components: a compressed sensing classifier that optimizes speed and performance, and a deep texture interaction fusion module that enhances content interaction and detail extraction. This network mitigates the tendency of Transformers to favor global information over local details, achieving improved image information integration through residual connections across windows. Furthermore, a classifier is employed to segment sub-image blocks prior to super-resolution, enabling efficient large-scale processing. The experimental results on the AID dataset indicate that PCFI achieves state-of-the-art performance, with a PSNR of 30.87 dB and an SSIM of 0.8131, while also delivering a 4.33% improvement in processing speed compared to the second-best method. |
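The PSNR figure quoted above (30.87 dB on AID) is the standard peak signal-to-noise ratio; a minimal sketch written directly from its definition is below. The images are synthetic placeholders, not data from the paper.

```python
# PSNR from its definition: 10 * log10(peak^2 / MSE), in dB.
import numpy as np

def psnr(reference: np.ndarray, reconstructed: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio between two images with the given peak value."""
    diff = reference.astype(np.float64) - reconstructed.astype(np.float64)
    mse = np.mean(diff ** 2)
    if mse == 0:
        return float("inf")   # identical images
    return 10.0 * np.log10(peak ** 2 / mse)

ref = np.full((16, 16), 128.0)
rec = ref + 4.0               # uniform error of 4 gray levels -> MSE = 16
print(round(psnr(ref, rec), 2))   # ≈ 36.09 dB
```

SSIM, the second metric reported, is structurally more involved (local means, variances, and covariances); in practice it is usually computed with a library implementation such as `skimage.metrics.structural_similarity` rather than by hand.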
doi_str_mv | 10.3390/rs16224201 |
format | article |
publisher | Basel: MDPI AG
rights | 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
lds50 | peer_reviewed
oa | free_for_read
orcidid | https://orcid.org/0009-0008-5705-6992; https://orcid.org/0000-0002-1366-3503; https://orcid.org/0000-0002-0241-7091; https://orcid.org/0000-0001-8375-442X; https://orcid.org/0009-0003-2310-7855
identifier | ISSN: 2072-4292 |
ispartof | Remote sensing (Basel, Switzerland), 2024-11, Vol.16 (22), p.4201 |
language | eng |
source | Publicly Available Content Database (Proquest) (PQ_SDU_P3) |
subjects | Accuracy; Classification; compressed sensing; Deep learning; Frequency dependence; Image reconstruction; Image resolution; Perception; Processing speed; Remote sensing; remote sensing images; Satellites; super-resolution; Texture
title | Feature Intensification Using Perception-Guided Regional Classification for Remote Sensing Image Super-Resolution |