
A deep convolutional neural network based classifier for passive acoustic monitoring of neotropical katydids

Bibliographic Details
Published in: The Journal of the Acoustical Society of America, 2019-10, Vol. 146 (4), p. 2982
Main Authors: Madhusudhana, Shyam Kumar, Symes, Laurel B., Klinck, Holger
Format: Article
Language: English
Description
Summary: Insects occupy a central position in terrestrial trophic webs, consuming large amounts of vegetation and being consumed by other fauna, and changes in insect communities can have widespread effects across an ecosystem. In addition, their relatively small dispersal distances and stereotyped calls make insects an excellent choice as a monitoring target. Here, passive acoustic monitoring is employed to study katydids, relatives of crickets and grasshoppers that are often charismatic mimics of leaves and sticks. In a prior study, call repertoires of over 60 species of katydids were established using individuals captured on Barro Colorado Island, Panama. These recordings are used to train a deep convolutional neural network based classifier for subsequent analyses of long-term field recordings that are currently being collected. Coarse imbalances in the number of captured individuals per species and in their calling rates resulted in an unbalanced dataset. Furthermore, differences in recording equipment and recording practices posed challenges to assembling a well-rounded dataset. The measures taken to address these impediments will be discussed along with the classifier design choices. The classifier yielded >98% mean average precision over 60 classes on a held-out validation split. Results of its application to field recordings will be presented.
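The abstract reports mean average precision (mAP) over 60 classes as its headline metric. As a minimal sketch of how that metric is computed (not the authors' actual evaluation code; the function names and toy data below are illustrative), the average precision for one class ranks detections by confidence and averages the precision at each true positive, and mAP is the mean of those per-class scores:

```python
# Hypothetical sketch of mean average precision (mAP), the metric reported
# for the 60-class katydid classifier. Pure Python, no dependencies.

def average_precision(scores, labels):
    """AP for one class: scores are classifier confidences, labels are 0/1
    ground truth. Precision is accumulated at each correctly ranked positive."""
    ranked = sorted(zip(scores, labels), key=lambda t: t[0], reverse=True)
    hits, precision_sum = 0, 0.0
    for k, (_, y) in enumerate(ranked, start=1):
        if y:
            hits += 1
            precision_sum += hits / k  # precision at rank k
    return precision_sum / max(hits, 1)

def mean_average_precision(per_class):
    """per_class: list of (scores, labels) pairs, one per species/class."""
    return sum(average_precision(s, l) for s, l in per_class) / len(per_class)

# Toy example with two classes (synthetic confidences, not real data)
class_a = ([0.9, 0.8, 0.3], [1, 1, 0])  # positives ranked first -> AP = 1.0
class_b = ([0.7, 0.6, 0.5], [0, 1, 1])  # AP = (1/2 + 2/3) / 2
print(mean_average_precision([class_a, class_b]))  # mean of per-class APs
```

With many classes, per-class AP also makes imbalance visible: a rare species with few examples contributes to the mean with the same weight as a common one, which is one reason mAP is a reasonable summary for an unbalanced multi-species dataset like this one.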
ISSN: 0001-4966, 1520-8524
DOI: 10.1121/1.5137323