Underwater Acoustic Target Classification Based on Dense Convolutional Neural Network
| Published in: | IEEE Geoscience and Remote Sensing Letters, 2022, Vol. 19, pp. 1-5 |
|---|---|
| Main Authors: | , , |
| Format: | Article |
| Language: | English |
| Summary: | In oceanic remote sensing operations, underwater acoustic target recognition has long been a difficult yet critically important task for sonar systems, especially under complex sound-wave propagation conditions. The high cost of training a recognition model for big-data analysis is typically an obstacle for most traditional machine learning (ML) algorithms, whereas the convolutional neural network (CNN), a type of deep neural network, can automatically extract features for accurate classification. In this study, we propose an approach using a dense CNN model for underwater target recognition. The network architecture is designed to reuse all former feature maps, optimizing classification rates under various impaired conditions while keeping the computational cost low. In addition, instead of using time-frequency spectrogram images, the proposed scheme directly takes the original time-domain audio signal as the network input. In experiments on a real-world passive-sonar data set, our classification model achieves an overall accuracy of 98.85% at 0-dB signal-to-noise ratio (SNR) and outperforms traditional ML techniques as well as other state-of-the-art CNN models. |
|---|---|
| ISSN: | 1545-598X, 1558-0571 |
|---|---|
| DOI: | 10.1109/LGRS.2020.3029584 |