
Automatic Counting Method for Centipedes Based on Deep Learning

Bibliographic Details
Published in: IEEE Access, 2024, Vol. 12, p. 84726-84737
Main Authors: Yao, Jin; Chen, Weitao; Wang, Tao; Yang, Fu; Sun, Xiaoyan; Yao, Chong; Jia, Liangquan
Format: Article
Language: English
Description
Summary: The utilization of target detection algorithms for counting edible centipedes represents a novel endeavor in applying deep learning to traditional Chinese medicine materials. However, the accuracy of current target detection algorithms is relatively low because of the complexity of the centipede background and the density of the detection targets, making them poorly suited to practical application scenarios. To address this, this study proposes a centipede target detection algorithm based on an improved You Only Look Once V5 (YOLOv5) model, termed FD-YOLO. The algorithm enhances the original model by incorporating the CBAM attention mechanism and the BiFormer vision transformer to suppress irrelevant information and intensify focus on the desired detection targets, thereby improving precision. In addition, FD-YOLO improves the existing SPPF module to enhance the model's generalizability and robustness. Experimental results demonstrate that, compared with the original YOLOv5 network, the improved model increases AP@0.5 by 3.5% to 97.2% and raises the recall rate from 86.2% to 92.9%. The enhanced YOLOv5 algorithm can therefore effectively detect and count centipedes, making it better suited to current practical application scenarios.
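
For readers unfamiliar with the attention module named in the abstract, the following is a minimal sketch of a CBAM-style block (channel attention followed by spatial attention) of the kind commonly inserted into YOLOv5 backbones. It is written assuming PyTorch; the class names, reduction ratio, and kernel size are illustrative defaults from the original CBAM design, not the authors' FD-YOLO code.

    # Illustrative CBAM-style attention block: channel attention, then spatial attention.
    # This is a sketch under assumed defaults, not the paper's implementation.
    import torch
    import torch.nn as nn

    class ChannelAttention(nn.Module):
        def __init__(self, channels: int, reduction: int = 16):
            super().__init__()
            # Shared MLP applied to both average-pooled and max-pooled channel descriptors.
            self.mlp = nn.Sequential(
                nn.Conv2d(channels, channels // reduction, kernel_size=1, bias=False),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels // reduction, channels, kernel_size=1, bias=False),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            avg = self.mlp(torch.mean(x, dim=(2, 3), keepdim=True))
            mx = self.mlp(torch.amax(x, dim=(2, 3), keepdim=True))
            return x * torch.sigmoid(avg + mx)

    class SpatialAttention(nn.Module):
        def __init__(self, kernel_size: int = 7):
            super().__init__()
            self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # Channel-wise average and max maps describe "where" to attend.
            avg = torch.mean(x, dim=1, keepdim=True)
            mx, _ = torch.max(x, dim=1, keepdim=True)
            return x * torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))

    class CBAM(nn.Module):
        def __init__(self, channels: int, reduction: int = 16, kernel_size: int = 7):
            super().__init__()
            self.channel = ChannelAttention(channels, reduction)
            self.spatial = SpatialAttention(kernel_size)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.spatial(self.channel(x))

    if __name__ == "__main__":
        feat = torch.randn(1, 256, 40, 40)  # example feature map from a detection backbone
        print(CBAM(256)(feat).shape)        # torch.Size([1, 256, 40, 40])

In such a design the block is typically placed after selected backbone or neck stages so that the detector re-weights features both per channel and per spatial location before prediction, which is the mechanism the abstract credits with suppressing background clutter around densely packed targets.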
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3414114