
Multiscale Context Deep Hashing for Remote Sensing Image Retrieval

Bibliographic Details
Published in: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2023, Vol. 16, pp. 7163-7172
Main Authors: Zhao, Dongjie; Chen, Yaxiong; Xiong, Shengwu
Format: Article
Language: English
Description: With the advancement of remote sensing satellites and sensor technology, the quantity and diversity of remote sensing imagery have grown steadily. Remote sensing image retrieval has therefore gained significant attention in the remote sensing community. Hashing methods are widely applied to remote sensing image retrieval because of their high computational efficiency, low storage cost, and strong performance. However, existing retrieval methods often struggle to accurately capture the intricate information in remote sensing images, and they often pay too little attention to key features. Neglecting multiscale and saliency information can cause feature loss and make it difficult to maintain the balance of hash codes. In response to these issues, we introduce a multiscale context deep hashing network (MSCDH). First, the proposed multiscale residual blocks yield finer-grained multiscale features and a larger receptive field. Then, the proposed multicontext attention modules enlarge the perceptual field and suppress interference from irrelevant information by aggregating contextual information along the channel and spatial dimensions. Experimental results on the UCMerced and WHU-RS datasets demonstrate that the proposed method achieves state-of-the-art retrieval performance.
DOI: 10.1109/JSTARS.2023.3298990
ISSN: 1939-1404
EISSN: 2151-1535
Subjects:
Attention mechanism
Context
Datasets
deep hash
Feature extraction
Image retrieval
multiscale context information
Receptive field
Remote sensing
Remote sensors
Semantics
Sensors
Task analysis
Visualization