Automatic Segmentation Annotation of Space Target Using Segment Anything Model and Object Detection Prompts
| Published in | IEEE Transactions on Aerospace and Electronic Systems, 2024-12, pp. 1-15 |
|---|---|
| Main Authors | |
| Format | Article |
| Language | English |
| ISSN | 0018-9251 |
| DOI | 10.1109/TAES.2024.3512533 |
Summary: The efficacy of intelligent recognition and inference in space target imagery depends profoundly on the scale and quality of the available datasets. Given the rarity of such images, labor-intensive manual annotation has been the traditional recourse. To address the demand for automatic segmentation annotation of space target imagery, we propose a framework built on the Segment Anything Model (SAM) that leverages object detection prompts to generate precise instance boundaries. Our framework incorporates a scalable set of prompts for foreground and background classes, maximizing the zero-shot potential of the visual model. We also address frequent anomalous mask regions in the annotations by implementing an easily integrated mask quality enhancement strategy, which yields a precision increase of 1.0%-2.0% and an 86.7% improvement in mask quality. Extensive evaluation across public space target image datasets shows the method to be highly accurate and generalizable. To support further research, we release both our specialized dataset and the project code at https://github.com/zhang-xaerospace/SpaceTarget-Segmentation-Dataset, aiming to mitigate the limitations imposed by the scarcity of annotated training data in current on-orbit machine vision systems.
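
For readers who want a concrete picture of the pipeline the abstract sketches, a minimal illustration follows, assuming Meta's `segment_anything` package. The checkpoint path, input file names, the detector box, the foreground/background point layout, and the largest-connected-component cleanup are all illustrative assumptions; the paper's actual prompt-generation and mask quality enhancement strategies are described in the full text, not here.

```python
# A minimal sketch of box-prompted SAM annotation, under the assumptions
# stated above. Not the paper's exact method.
import cv2
import numpy as np
from segment_anything import SamPredictor, sam_model_registry

# Load a SAM checkpoint (backbone and path are placeholders).
sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
predictor = SamPredictor(sam)

# SAM's predictor expects an RGB uint8 image.
image = cv2.cvtColor(cv2.imread("space_target.png"), cv2.COLOR_BGR2RGB)
predictor.set_image(image)

# Hypothetical detector output: one box per target, [x0, y0, x1, y1].
box = np.array([120, 80, 420, 360])

# Assumed prompt layout: box center as a foreground point (label 1),
# corners pushed slightly outside the box as background points (label 0).
cx, cy = (box[0] + box[2]) / 2, (box[1] + box[3]) / 2
points = np.array([[cx, cy],
                   [box[0] - 10, box[1] - 10],
                   [box[2] + 10, box[3] + 10]])
point_labels = np.array([1, 0, 0])

masks, scores, _ = predictor.predict(
    point_coords=points,
    point_labels=point_labels,
    box=box,
    multimask_output=False,
)
mask = masks[0].astype(np.uint8)

# Simple mask-quality cleanup (an assumption, not the paper's strategy):
# keep only the largest connected component to drop anomalous mask regions.
n, cc, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)
if n > 1:
    largest = 1 + np.argmax(stats[1:, cv2.CC_STAT_AREA])
    mask = (cc == largest).astype(np.uint8)

cv2.imwrite("space_target_mask.png", mask * 255)
```

In this sketch, the box prompt constrains SAM to the detected instance, the background points help suppress spill-over onto nearby structures, and the component filter removes small spurious regions of the kind the abstract mentions.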