Global Mask R-CNN for marine ship instance segmentation
Published in: Neurocomputing (Amsterdam), 2022-04, Vol. 480, pp. 257–270
Main Authors:
Format: Article
Language: English
Summary: Instance segmentation technology can provide accurate and efficient segmentation for the visual perception of marine scenes, especially in the development of unmanned ships. However, the community lacks suitable open-source datasets. To address this shortage, an instance segmentation dataset for marine ships was collected and labeled. The dataset, named MariShipInsSeg, consists of 4k high-quality visible-light marine ship images with 8,413 instances. Because marine ships are photographed from far away, ship objects carry little detail information; a global method is therefore adopted to make full use of global location and semantic information, which is helpful for ship instance segmentation. A new method called Global Mask R-CNN (GM R-CNN) is proposed, which uses Precise RoI Pooling and a Global Mask Head to preserve the global information of instances and improve ship instance segmentation performance. Experiments on the challenging MS COCO dataset and the MariShipInsSeg dataset show that Global Mask R-CNN achieves state-of-the-art performance. Without any bells and whistles, the proposed GM R-CNN achieves 38.7% mask AP on MS COCO test-dev and 48.6% mask AP on the MariShipInsSeg test set, gains of 1.6% and 1.9% over Mask R-CNN.
ISSN: 0925-2312, 1872-8286
DOI: 10.1016/j.neucom.2022.01.017