Research on Obstacle Detection and Avoidance of Autonomous Underwater Vehicle Based on Forward-Looking Sonar
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2023-11, Vol. 34 (11), pp. 9198-9208
Main Authors:
Format: Article
Language: English
Summary: Due to the complexity of the ocean environment, an autonomous underwater vehicle (AUV) is disturbed by obstacles when performing tasks, so research on underwater obstacle detection and avoidance is particularly important. Based on images collected by a forward-looking sonar mounted on an AUV, this article proposes an obstacle detection and avoidance algorithm. First, a deep-learning-based obstacle candidate area detection algorithm is developed; it uses the You Only Look Once (YOLO) v3 network to determine obstacle candidate areas in a sonar image. Then, within the determined candidate areas, an obstacle detection algorithm based on an improved threshold segmentation algorithm is used to detect obstacles accurately. Finally, using the obstacle detection results obtained from the sonar images, an obstacle avoidance algorithm based on deep reinforcement learning (DRL) is developed to plan a reasonable obstacle avoidance path for the AUV. Experimental results show that the proposed algorithms improve obstacle detection accuracy and the processing speed of sonar images, while ensuring AUV navigation safety in a complex obstacle environment.
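The abstract describes a two-stage detection pipeline: a detector (YOLO v3) proposes candidate boxes, and thresholding is then applied only inside each box. The paper's "improved" threshold segmentation is not specified in this record, so the sketch below substitutes plain Otsu thresholding within a detector-supplied candidate box; the function names, box format `(x, y, w, h)`, and synthetic image are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def otsu_threshold(region: np.ndarray) -> int:
    """Plain Otsu thresholding on an 8-bit grayscale region:
    choose the threshold that maximizes between-class variance.
    (Stand-in for the paper's improved segmentation, which is unspecified here.)"""
    hist = np.bincount(region.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0 = prob[:t].sum()          # weight of the background class
        w1 = 1.0 - w0                # weight of the foreground class
        if w0 == 0.0 or w1 == 0.0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0        # background mean
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1   # foreground mean
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def segment_candidate(image: np.ndarray, box: tuple) -> np.ndarray:
    """Threshold only inside one candidate box (x, y, w, h) from the detector,
    returning a binary obstacle mask for that region."""
    x, y, w, h = box
    region = image[y:y + h, x:x + w]
    t = otsu_threshold(region)
    return (region >= t).astype(np.uint8)

# Synthetic example: noisy dark background with one bright sonar echo ("obstacle").
rng = np.random.default_rng(0)
img = rng.integers(0, 60, size=(64, 64)).astype(np.uint8)
img[20:30, 20:30] = 200               # bright 10x10 echo inside the candidate box
mask = segment_candidate(img, (15, 15, 25, 25))
```

Restricting segmentation to detector-proposed regions, as the abstract describes, avoids thresholding the full sonar frame and is one plausible source of the reported processing-speed gain.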
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2022.3156907