YOLOv4‐dense: A smaller and faster YOLOv4 for real‐time edge‐device based object detection in traffic scene
Published in: IET Image Processing, 2023-02, Vol. 17 (2), pp. 570-580
Main Authors: , , , ,
Format: Article
Language: English
Summary: Edge-device-based object detection is crucial in many real-world applications, such as self-driving cars, advanced driver-assistance systems (ADAS), and driver behavior analysis. Although deep learning (DL) has become the de facto approach for object detection, the limited computing resources of embedded devices and the large model size of current DL-based methods make real-time object detection on edge devices difficult. To overcome these difficulties, this work proposes a novel YOLOv4-dense model that detects objects accurately and quickly; it is built on top of the YOLOv4 framework but with substantial improvements. Specifically, many CSP layers, which slow down inference, are pruned, and a dense block is introduced to address the loss of small objects (see the illustrative sketch after this record). In addition, a lightweight two-stream YOLO head is designed to further reduce the computational complexity of the model. Experimental results on the NVIDIA Jetson TX2 embedded platform demonstrate that YOLOv4-dense achieves higher accuracy and faster speed with a smaller model size. For instance, on the KITTI dataset, YOLOv4-dense obtains 84.3% mAP at 22.6 FPS with only 20.3 M parameters, surpassing state-of-the-art models with comparable parameter budgets, such as YOLOv3-tiny, YOLOv4-tiny, and PP-YOLO-tiny, by a large margin.
ISSN: 1751-9659, 1751-9667
DOI: 10.1049/ipr2.12656
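
The abstract states only that a dense block is introduced to counter the loss of small objects; it does not give layer counts, growth rate, or where the block sits inside YOLOv4-dense. The following is a minimal, assumption-based sketch of a DenseNet-style dense block in PyTorch, illustrating the general idea of dense feature concatenation rather than the authors' exact design; the class names, growth rate, and layer count are hypothetical.

```python
# Hypothetical dense-block sketch; hyperparameters are assumptions,
# not the configuration used in YOLOv4-dense.
import torch
import torch.nn as nn


class ConvBNLeaky(nn.Module):
    """3x3 convolution -> batch norm -> LeakyReLU, a common YOLO-style unit."""

    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.LeakyReLU(0.1, inplace=True),
        )

    def forward(self, x):
        return self.block(x)


class DenseBlock(nn.Module):
    """Each layer receives the concatenation of all preceding feature maps,
    which preserves fine-grained detail that helps detect small objects."""

    def __init__(self, in_ch, growth_rate=32, num_layers=4):
        super().__init__()
        self.layers = nn.ModuleList(
            ConvBNLeaky(in_ch + i * growth_rate, growth_rate)
            for i in range(num_layers)
        )

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            out = layer(torch.cat(features, dim=1))
            features.append(out)
        return torch.cat(features, dim=1)


if __name__ == "__main__":
    # Example: a 256-channel, 52x52 feature map grows by 4 * 32 channels.
    block = DenseBlock(in_ch=256, growth_rate=32, num_layers=4)
    x = torch.randn(1, 256, 52, 52)
    print(block(x).shape)  # torch.Size([1, 384, 52, 52])
```

In this kind of block, the output channel count grows linearly with the number of layers (here 256 + 4 x 32 = 384), so a 1x1 transition convolution is typically placed afterwards to keep the downstream parameter budget small, which matches the paper's emphasis on a compact model.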