DGMA2-Net: A Difference-Guided Multiscale Aggregation Attention Network for Remote Sensing Change Detection
Remote sensing change detection (RSCD) focuses on identifying regions that have undergone changes between two remote sensing images captured at different times. Recently, convolutional neural networks (CNNs) have shown promising results in the challenging task of RSCD. However, these methods do not efficiently fuse bitemporal features or extract useful information that benefits subsequent RSCD tasks. In addition, they do not consider multilevel feature interactions during feature aggregation and ignore the relationships between difference features and bitemporal features, which degrades RSCD results. To address these problems, a difference-guided multiscale aggregation attention network, DGMA2-Net, is developed. Bitemporal features at different levels are extracted through a Siamese convolutional network, and a multiscale difference fusion module (MDFM) is then created to fuse bitemporal features and extract, in a multiscale manner, difference features containing rich contextual information. After the MDFM, two difference aggregation modules (DAMs) aggregate difference features at different levels for multilevel feature interactions. The DAM outputs are sent to difference-enhanced attention modules (DEAMs) to strengthen the connections between bitemporal features and difference features and to further refine the change features. Finally, the refined change features are superimposed from deep to shallow to produce a change map. To validate the effectiveness of DGMA2-Net, a series of experiments is conducted on three public RSCD benchmark datasets: the LEVIR building change detection dataset (LEVIR-CD), the Wuhan University building change detection dataset (BCDD), and the Sun Yat-Sen University dataset (SYSU-CD). The experimental results demonstrate that DGMA2-Net surpasses eight current state-of-the-art RSCD methods. Our code is released at https://github.com/yikuizhai/DGMA2-Net .
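The dataflow the abstract describes — compute difference features from bitemporal features at several scales, then superimpose them from deep to shallow into a change map — can be sketched in plain NumPy. This is a hypothetical illustration, not the paper's method: the MDFM, DAMs, and DEAMs are learned modules, whereas the `change_map` and `upsample2x` helpers below stand in for them with a simple absolute difference and nearest-neighbor upsampling, to show only the overall pipeline shape.

```python
import numpy as np

def upsample2x(x):
    # Nearest-neighbor 2x upsampling along the last two (spatial) axes.
    return x.repeat(2, axis=-2).repeat(2, axis=-1)

def change_map(feats_t1, feats_t2, threshold=0.5):
    """Toy deep-to-shallow aggregation of difference features.

    feats_t1 / feats_t2: lists of (C, H, W) arrays ordered shallow to
    deep, each level half the spatial size of the previous one.
    """
    # Absolute difference stands in for the learned multiscale fusion.
    diffs = [np.abs(a - b) for a, b in zip(feats_t1, feats_t2)]
    # Superimpose difference features from the deepest level upward.
    acc = diffs[-1]
    for d in reversed(diffs[:-1]):
        acc = upsample2x(acc) + d
    # Collapse channels and threshold to a binary change map.
    score = acc.mean(axis=0)
    return (score > threshold).astype(np.uint8)

# Usage with random stand-in features at three pyramid levels:
rng = np.random.default_rng(0)
feats_a = [rng.random((4, 8 // 2**i, 8 // 2**i)) for i in range(3)]
feats_b = [rng.random((4, 8 // 2**i, 8 // 2**i)) for i in range(3)]
cm = change_map(feats_a, feats_b)   # (8, 8) binary map
```

Identical inputs yield an all-zero map, since every difference feature vanishes before thresholding.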
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2024, Vol. 62, pp. 1-16
Main Authors: Ying, Zilu; Tan, Zijun; Zhai, Yikui; Jia, Xudong; Li, Wenba; Zeng, Junying; Genovese, Angelo; Piuri, Vincenzo; Scotti, Fabio
Format: Article
Language: English
Subjects: Aggregation; Artificial neural networks; Change detection; Datasets; Deep learning; Difference aggregation module (DAM); difference-enhanced attention module (DEAM); Feature extraction; Information processing; multiscale difference fusion module (MDFM); Neural networks; Remote sensing; remote sensing change detection (RSCD); Semantics
container_end_page | 16 |
container_issue | |
container_start_page | 1 |
container_title | IEEE transactions on geoscience and remote sensing |
container_volume | 62 |
creator | Ying, Zilu; Tan, Zijun; Zhai, Yikui; Jia, Xudong; Li, Wenba; Zeng, Junying; Genovese, Angelo; Piuri, Vincenzo; Scotti, Fabio |
description | Remote sensing change detection (RSCD) focuses on identifying regions that have undergone changes between two remote sensing images captured at different times. Recently, convolutional neural networks (CNNs) have shown promising results in the challenging task of RSCD. However, these methods do not efficiently fuse bitemporal features or extract useful information that benefits subsequent RSCD tasks. In addition, they do not consider multilevel feature interactions during feature aggregation and ignore the relationships between difference features and bitemporal features, which degrades RSCD results. To address these problems, a difference-guided multiscale aggregation attention network, DGMA2-Net, is developed. Bitemporal features at different levels are extracted through a Siamese convolutional network, and a multiscale difference fusion module (MDFM) is then created to fuse bitemporal features and extract, in a multiscale manner, difference features containing rich contextual information. After the MDFM, two difference aggregation modules (DAMs) aggregate difference features at different levels for multilevel feature interactions. The DAM outputs are sent to difference-enhanced attention modules (DEAMs) to strengthen the connections between bitemporal features and difference features and to further refine the change features. Finally, the refined change features are superimposed from deep to shallow to produce a change map. To validate the effectiveness of DGMA2-Net, a series of experiments is conducted on three public RSCD benchmark datasets: the LEVIR building change detection dataset (LEVIR-CD), the Wuhan University building change detection dataset (BCDD), and the Sun Yat-Sen University dataset (SYSU-CD). The experimental results demonstrate that DGMA2-Net surpasses eight current state-of-the-art RSCD methods. Our code is released at https://github.com/yikuizhai/DGMA2-Net . |
doi_str_mv | 10.1109/TGRS.2024.3390206 |
format | article |
identifier | ISSN: 0196-2892 |
ispartof | IEEE transactions on geoscience and remote sensing, 2024, Vol.62, p.1-16 |
issn | 0196-2892 (print); 1558-0644 (electronic) |
language | eng |
recordid | cdi_ieee_primary_10504297 |
source | IEEE Electronic Library (IEL) Journals |
subjects | Aggregation; Artificial neural networks; Change detection; Dams; Datasets; Deep learning; Difference aggregation module (DAM); difference-enhanced attention module (DEAM); Feature extraction; Fuses; Information processing; Modules; multiscale difference fusion module (MDFM); Neural networks; Noise; Remote sensing; remote sensing change detection (RSCD); Semantics; Task analysis |
title | DGMA2-Net: A Difference-Guided Multiscale Aggregation Attention Network for Remote Sensing Change Detection |