Two-stage ship detection at long distances based on deep learning and slicing technique

Bibliographic Details
Published in: PLoS ONE, 2024-11, Vol. 19 (11), p. e0313145
Main Authors: Gong, Yanfeng, Chen, Zihao, Tan, Jiawan, Yin, Chaozhong, Deng, Wen
Format: Article
Language:English
Description
Summary: Ship detection over long distances is crucial for the visual perception of intelligent ships. Traditional image-processing methods are not robust, whereas deep learning-based recognition methods can learn the features of small ships automatically. However, because ships at long distances occupy only a few pixels, accurate features of such ships are difficult to extract. To address this, a two-stage object detection method that combines the advantages of traditional and deep learning methods is proposed. In the first stage, an object detection model for the sea-sky line (SSL) region is trained to select the region where ships are likely to appear. In the second stage, another object detection model for ships is trained on sliced patches containing ships. At test time, the SSL region is first detected with a trained You Only Look Once version 8 (YOLOv8) model. The detected SSL region is then divided into several overlapping patches using the slicing technique, and a second trained YOLOv8 model detects ships within the patches. Experimental results show that the method achieves 85% average precision at an intersection-over-union threshold of 0.5 (AP50) and a detection speed of 75 ms per image at a resolution of 1080×640 pixels. The code is available at https://github.com/gongyanfeng/PaperCode.
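The slicing step described in the second stage can be sketched as follows. This is a minimal illustration, not the authors' implementation (which is available at the repository above): the function name, default patch size, and overlap ratio are assumptions. The idea is that each patch overlaps its neighbor so a ship straddling a patch boundary still appears whole in at least one patch.

```python
def slice_region(x1, y1, x2, y2, patch_w=320, patch_h=160, overlap=0.5):
    """Divide a detected SSL region (x1, y1, x2, y2) into overlapping
    fixed-size patches (hypothetical helper; parameters are illustrative)."""
    # Stride between patch origins; a nonzero overlap fraction ensures
    # objects cut by one patch edge are fully contained in a neighbor.
    step_x = max(1, int(patch_w * (1 - overlap)))
    step_y = max(1, int(patch_h * (1 - overlap)))
    patches = []
    y = y1
    while True:
        # Clamp the patch to the region; shift the last row/column back
        # inward so every patch keeps the same size for the detector.
        py2 = min(y + patch_h, y2)
        py1 = max(y1, py2 - patch_h)
        x = x1
        while True:
            px2 = min(x + patch_w, x2)
            px1 = max(x1, px2 - patch_w)
            patches.append((px1, py1, px2, py2))
            if px2 >= x2:
                break
            x += step_x
        if py2 >= y2:
            break
        y += step_y
    return patches
```

Each patch is then passed to the second-stage detector, and any box it returns is mapped back to image coordinates by adding the patch's (px1, py1) offset; overlapping duplicates would typically be merged with non-maximum suppression.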
ISSN:1932-6203
DOI:10.1371/journal.pone.0313145