EventFinder: a program for screening remotely captured images
Published in: Environmental Monitoring and Assessment, 2019-06, Vol. 191 (6), p. 406-10, Article 406
Main Authors: , , ,
Format: Article
Language: English
Summary: Camera traps are becoming ubiquitous tools for ecologists. While easily deployed, they require human time to organize, review, and classify images, including sequences of images of the same individual and non-target images triggered by environmental conditions. For such cases, we developed an automated computer program, named EventFinder, to reduce operator time by pre-processing and classifying images using background subtraction techniques and color histogram comparisons. We tested the accuracy of the program against images previously classified by a human operator. The automated classification, on average, reduced the data requiring human input by 90.8% with an accuracy of 96.1%, and produced a false positive rate of only 3.4%. Thus, EventFinder provides an efficient method for reducing the time human operators spend reviewing and classifying images, making camera trap projects, which compile large numbers of images, less costly to process. Our testing used medium to large animals, but the program will also work with smaller animals, provided their images occupy a sufficient area of the frame. While our discussion focuses on camera trap image reduction, we also discuss how EventFinder might be used in conjunction with other software developments for managing camera trap data.
ISSN: 0167-6369, 1573-2959
DOI: 10.1007/s10661-019-7518-9
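
The summary above names two techniques at the core of EventFinder: background subtraction and color histogram comparison. The record does not include the program's code, so the sketch below is not EventFinder itself; it is a minimal Python/OpenCV illustration, under stated assumptions, of how a camera trap frame might be flagged for human review using those two checks. The thresholds, decision rule, function names, and file names are all hypothetical.

```python
# Hypothetical illustration only: NOT EventFinder's implementation. It sketches
# the two techniques the summary names (background subtraction and color
# histogram comparison) with OpenCV; thresholds and names are assumptions.
import cv2

# Assumed tuning parameters.
PIXEL_DIFF_THRESHOLD = 25      # grayscale difference treated as "changed"
FOREGROUND_FRACTION = 0.01     # min. fraction of changed pixels to flag a frame
HIST_SIMILARITY_CUTOFF = 0.90  # histogram correlation below this supports a flag


def color_histogram(image):
    """Normalized 3-D BGR histogram, so images of any size compare fairly."""
    hist = cv2.calcHist([image], [0, 1, 2], None, [8, 8, 8],
                        [0, 256, 0, 256, 0, 256])
    return cv2.normalize(hist, hist).flatten()


def is_candidate_event(background, frame):
    """Return True if `frame` differs enough from `background` to merit review."""
    # Background subtraction: pixel-wise difference against an empty scene.
    bg_gray = cv2.cvtColor(background, cv2.COLOR_BGR2GRAY)
    fr_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(bg_gray, fr_gray)
    _, mask = cv2.threshold(diff, PIXEL_DIFF_THRESHOLD, 255, cv2.THRESH_BINARY)
    changed_fraction = cv2.countNonZero(mask) / mask.size

    # Color histogram comparison: a drop in overall color similarity is a
    # second, coarser signal that the scene content has changed.
    similarity = cv2.compareHist(color_histogram(background),
                                 color_histogram(frame),
                                 cv2.HISTCMP_CORREL)

    # Assumed decision rule: flag only when both signals agree, to suppress
    # non-target triggers; EventFinder's actual rule may differ.
    return changed_fraction > FOREGROUND_FRACTION and similarity < HIST_SIMILARITY_CUTOFF


if __name__ == "__main__":
    background = cv2.imread("empty_scene.jpg")     # hypothetical file names
    frame = cv2.imread("candidate_frame.jpg")
    if background is not None and frame is not None:
        print("flag for human review" if is_candidate_event(background, frame)
              else "likely empty frame")
```

In this sketch a frame is flagged only when both checks agree, which loosely mirrors the stated goal of discarding non-target images triggered by environmental conditions while keeping frames that likely contain an animal; the actual parameters and logic used by EventFinder are described in the full article.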