Few‐shot learning for satellite characterisation from synthetic inverse synthetic aperture radar images
Space situational awareness systems primarily focus on detecting and tracking space objects, providing crucial positional data. However, understanding the complex space domain requires characterising satellites, which often involves estimating bus and solar panel sizes. While inverse synthetic aperture radar (ISAR) allows satellite visualisation, developing deep learning models for substructure segmentation in ISAR images is challenging due to the high cost and hardware requirements of real data collection. The authors present a framework that addresses the scarcity of ISAR data through synthetic training data. Their approach uses a few‐shot domain adaptation technique, leveraging thousands of rapidly simulated low‐fidelity ISAR images and a small set of ISAR images from the target domain. The framework is validated by simulating a real‐case scenario: a deep learning‐based segmentation model is fine‐tuned on four target-domain ISAR images generated through the backprojection algorithm from raw radar data simulated at the analogue‐to‐digital converter level. The results demonstrate that the proposed framework significantly improves ISAR image segmentation across diverse domains, enabling accurate characterisation of satellite bus and solar panel sizes, as well as their orientation, even when the images are sourced from different domains.
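The image-formation step mentioned in the abstract, forming ISAR images with the backprojection algorithm from simulated raw radar data, can be illustrated on a toy turntable-ISAR scene. The sketch below is a hypothetical NumPy example rather than the paper's simulator: the carrier frequency, bandwidth, observation arc, and point-scatterer position are all illustrative assumptions.

```python
import numpy as np

c = 3e8                                  # speed of light [m/s]
f0 = 10e9                                # carrier frequency [Hz], assumed X-band
B = 1e9                                  # bandwidth [Hz] -> range resolution c/(2B)
dr = c / (2 * B)                         # 0.15 m range-bin spacing
n_pulses = 128
angles = np.deg2rad(np.linspace(-2, 2, n_pulses))   # small observation arc

# One point scatterer in the target frame (cross-range x, slant-range y)
xt, yt = 0.9, -0.6

# Simulate range-compressed profiles: a sinc envelope centred at the
# scatterer's projected range, carrying the two-way carrier phase
rbins = np.arange(-64, 64) * dr
profiles = np.zeros((n_pulses, rbins.size), dtype=complex)
for i, th in enumerate(angles):
    r = xt * np.sin(th) + yt * np.cos(th)           # projected range this pulse
    profiles[i] = np.sinc((rbins - r) / dr) * np.exp(-4j * np.pi * f0 * r / c)

# Backprojection: for every pixel, sample each pulse's profile at the pixel's
# range, undo the carrier phase, and sum coherently over the observation arc
xs = np.linspace(-1.5, 1.5, 61)
ys = np.linspace(-1.5, 1.5, 61)
image = np.zeros((ys.size, xs.size), dtype=complex)
for i, th in enumerate(angles):
    for iy, y in enumerate(ys):
        rp = xs * np.sin(th) + y * np.cos(th)       # pixel ranges for this pulse
        samp = (np.interp(rp, rbins, profiles[i].real)
                + 1j * np.interp(rp, rbins, profiles[i].imag))
        image[iy] += samp * np.exp(4j * np.pi * f0 * rp / c)

iy, ix = np.unravel_index(np.argmax(np.abs(image)), image.shape)
print(f"focused peak at x={xs[ix]:.2f} m, y={ys[iy]:.2f} m")  # near (0.9, -0.6)
```

Real ISAR processing additionally requires motion compensation and operates on measured (or ADC-level simulated) pulses, but the coherent sum over aspect angles is the same principle that focuses the image.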
Published in: | IET radar, sonar & navigation, 2024-04, Vol.18 (4), p.649-656 |
---|---|
Main Authors: | Heslinga, Friso G.; Uysal, Faruk; Rooij, Sabina B.; Berberich, Sven; Caro Cuenca, Miguel |
Format: | Article |
Language: | English |
Subjects: | artificial intelligence; image segmentation; inverse synthetic aperture radar (ISAR); satellite tracking |
container_end_page | 656 |
container_issue | 4 |
container_start_page | 649 |
container_title | IET radar, sonar & navigation |
container_volume | 18 |
creator | Heslinga, Friso G.; Uysal, Faruk; Rooij, Sabina B.; Berberich, Sven; Caro Cuenca, Miguel |
description | Space situational awareness systems primarily focus on detecting and tracking space objects, providing crucial positional data. However, understanding the complex space domain requires characterising satellites, often involving estimation of bus and solar panel sizes. While inverse synthetic aperture radar allows satellite visualisation, developing deep learning models for substructure segmentation in inverse synthetic aperture radar images is challenging due to the high costs and hardware requirements. The authors present a framework addressing the scarcity of inverse synthetic aperture radar data through synthetic training data. The authors' approach utilises a few‐shot domain adaptation technique, leveraging thousands of rapidly simulated low‐fidelity inverse synthetic aperture radar images and a small set of inverse synthetic aperture radar images from the target domain. The authors validate their framework by simulating a real‐case scenario, fine‐tuning a deep learning‐based segmentation model using four inverse synthetic aperture radar images generated through the backprojection algorithm from simulated raw radar data (simulated at the analogue‐to‐digital converter level) as the target domain. The authors' results demonstrate the effectiveness of the proposed framework, significantly improving inverse synthetic aperture radar image segmentation across diverse domains. This enhancement enables accurate characterisation of satellite bus and solar panel sizes as well as their orientation, even when the images are sourced from different domains.
Inverse synthetic aperture radar (ISAR) allows for visualisation of satellites, but development of deep learning‐based segmentation models can be challenging because of the limited amount of real ISAR data. An image analysis framework based on simulated data and few‐shot learning for domain‐adaptation is proposed. We obtain segmentation results that enable characterisation of the satellite bus and solar panel size and their orientation. |
doi_str_mv | 10.1049/rsn2.12516 |
format | article |
fulltext | fulltext |
identifier | ISSN: 1751-8784 |
ispartof | IET radar, sonar & navigation, 2024-04, Vol.18 (4), p.649-656 |
issn | 1751-8784; 1751-8792 |
language | eng |
recordid | cdi_doaj_primary_oai_doaj_org_article_a75f5ff91e1e49dfb24728cf845e2e1a |
source | Open Access: Wiley-Blackwell Open Access Journals; IET Digital Library Journals |
subjects | artificial intelligence; image segmentation; inverse synthetic aperture radar (ISAR); satellite tracking |
title | Few‐shot learning for satellite characterisation from synthetic inverse synthetic aperture radar images |
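Once a segmentation mask is available, the characterisation step the abstract describes, recovering bus and solar panel size and orientation, reduces to simple geometry on the mask pixels. The following is a minimal NumPy sketch using the principal axes of the pixel coordinates; the image size, rectangle dimensions, and 30° test angle are made-up values for illustration, not from the paper.

```python
import numpy as np

def mask_extent(mask):
    """Estimate length, width, and orientation of a roughly rectangular
    structure (e.g. a satellite bus) from its binary segmentation mask,
    via the principal axes of the pixel coordinates."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(float)
    pts -= pts.mean(axis=0)
    # principal axes from the 2x2 coordinate covariance
    cov = pts.T @ pts / len(pts)
    evals, evecs = np.linalg.eigh(cov)          # eigenvalues ascending
    major = evecs[:, 1]                         # major axis direction
    angle = np.degrees(np.arctan2(major[1], major[0])) % 180
    # project pixels onto the axes; extent = peak-to-peak projection
    proj = pts @ evecs
    width, length = np.ptp(proj, axis=0) + 1    # +1 for pixel footprint
    return length, width, angle

# synthetic test mask: a 41x11-pixel rectangle rotated by 30 degrees
H = W = 128
yy, xx = np.mgrid[0:H, 0:W]
th = np.deg2rad(30)
u = (xx - 64) * np.cos(th) + (yy - 64) * np.sin(th)
v = -(xx - 64) * np.sin(th) + (yy - 64) * np.cos(th)
mask = (np.abs(u) <= 20) & (np.abs(v) <= 5)
L, Wd, A = mask_extent(mask)
print(f"length≈{L:.1f}px width≈{Wd:.1f}px angle≈{A:.1f}°")
```

For a real satellite the bus and panel masks would be processed separately (one call per class), and pixel extents would be scaled by the image's range and cross-range resolution to obtain metric sizes.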