Underwater acoustic detection and classification for cetaceans' vocalizations of Marine Observatory in the Northeastern Taiwan (MONET)
Published in: The Journal of the Acoustical Society of America, 2012-04, Vol. 131 (4_Supplement), p. 3494-3494
Main Authors: , ,
Format: Article
Language: English
Summary: Passive acoustics is an important tool for observing marine animals and for long-term monitoring of the underwater environment. Because the amount of recorded data is enormous, an effective automatic detector is needed to select critical features from the recorded acoustic signal and classify their patterns. In this study, we developed an automatic detector with both a feature-extraction module and a classification module. In the feature-extraction module, we select features such as the energy and end-points of the time-domain signal as needed. The extracted features are then normalized and used as inputs to the classification module, which is based on a back-propagation neural network (BPNN). The BPNN is trained and tested on cetacean acoustic signals recorded by the hydrophone of the Marine Cable Hosted Observatory (MACHO) system until the network becomes stable and convergent. The identification targets are cetacean species commonly sighted off Guishan Island, in the northeastern offshore waters of Taiwan. The detector is a robust tool with a good recognition rate for classifying cetaceans, and it can relieve experienced human operators of this task at lower time and labor cost. (Sponsored by the National Science Council of the Republic of China under project "Marine Observatory in the Northeastern Taiwan (MONET)," No. NSC 100-2221-E-002-027.)
ISSN: 0001-4966, 1520-8524
DOI: 10.1121/1.4709202
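The pipeline described in the abstract (energy and end-point feature extraction, normalization, and BPNN classification) can be sketched as follows. This is a minimal illustration and not the authors' implementation: the frame length, the relative energy threshold for end-point detection, and the network size and learning rate are all assumptions, since the abstract does not specify them.

```python
import numpy as np

def frame_energy_features(signal, frame_len=256, rel_threshold=0.1):
    """Split a 1-D signal into frames, compute per-frame energy, and find
    the end-points: the first and last frame whose energy exceeds a
    threshold relative to the peak frame energy (assumed criterion)."""
    n_frames = len(signal) // frame_len
    frames = signal[: n_frames * frame_len].reshape(n_frames, frame_len)
    energy = np.sum(frames ** 2, axis=1)
    active = np.where(energy > rel_threshold * energy.max())[0]
    return energy, active[0], active[-1]

def normalize(features):
    """Min-max normalize features to [0, 1] before feeding the network."""
    f = np.asarray(features, dtype=float)
    return (f - f.min()) / (f.max() - f.min() + 1e-12)

class BPNN:
    """Minimal one-hidden-layer back-propagation network (sigmoid units)."""
    def __init__(self, n_in, n_hidden, n_out, lr=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 1.0, (n_in, n_hidden))
        self.W2 = rng.normal(0.0, 1.0, (n_hidden, n_out))
        self.lr = lr

    @staticmethod
    def _sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def forward(self, X):
        self.h = self._sigmoid(X @ self.W1)   # hidden activations
        self.y = self._sigmoid(self.h @ self.W2)  # output activations
        return self.y

    def train_step(self, X, target):
        """One batch of gradient descent on squared error; returns MSE."""
        y = self.forward(X)
        d_out = (y - target) * y * (1.0 - y)          # output-layer delta
        d_hid = (d_out @ self.W2.T) * self.h * (1.0 - self.h)  # hidden delta
        self.W2 -= self.lr * self.h.T @ d_out
        self.W1 -= self.lr * X.T @ d_hid
        return float(np.mean((y - target) ** 2))
```

In use, `train_step` would be called repeatedly on the normalized feature vectors until the error converges, mirroring the abstract's "stable and convergent" training criterion.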