Detection and performance analysis of vulnerable road users in low light conditions using YOLO
Format: Conference Proceeding
Language: English
Summary: Vulnerable road user (VRU) detection is a challenging task for autonomous vehicles and advanced driver assistance systems, which aim to enhance the protection of individuals such as pedestrians, cyclists and motorcyclists who are more susceptible to accidents on roadways. Detecting VRUs becomes even more critical under adverse lighting conditions arising from poor illumination, bad weather and other challenging circumstances. To maintain the highest level of road safety, these detection procedures must be accurate and reliable. Various techniques have been employed for VRU detection: traditional methods rely on hand-crafted feature extraction from images, while deep learning approaches have demonstrated superior performance by automatically learning hierarchical features from large image datasets. Despite significant progress, real-time VRU detection under low light remains challenging, requiring a balance between detection speed and accuracy. In this paper, a study is conducted to analyze the performance of YOLOv5m, YOLOv7 and YOLOv8m in low-light conditions. The experimental results on the ExDark dataset reveal that YOLOv7 performs well, with a good balance between speed and accuracy. The performance indices of YOLOv7 with respect to precision, recall and mean average precision (mAP) are 0.88, 0.746 and 0.827, respectively.
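The precision, recall and mAP figures quoted in the abstract follow the standard object-detection definitions. As a hedged illustration only — the counts and confidence scores below are invented for the example and are not taken from the paper — these metrics can be computed as:

```python
def precision_recall(tp, fp, fn):
    """Precision = TP / (TP + FP); Recall = TP / (TP + FN)."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

def average_precision(detections, num_gt):
    """Average precision (AP) as the area under the precision-recall
    curve, using all-point interpolation (the precision envelope).
    `detections` is a list of (confidence, is_true_positive) pairs for
    one class; `num_gt` is the number of ground-truth boxes. mAP is
    the mean of AP over all classes."""
    detections = sorted(detections, key=lambda d: d[0], reverse=True)
    tp = fp = 0
    points = []  # (precision, recall) after each detection is counted
    for _, is_tp in detections:
        if is_tp:
            tp += 1
        else:
            fp += 1
        points.append((tp / (tp + fp), tp / num_gt))
    ap, prev_recall = 0.0, 0.0
    for i, (_, r) in enumerate(points):
        # envelope: best precision achievable at this recall or higher
        p_max = max(pt[0] for pt in points[i:])
        ap += p_max * (r - prev_recall)
        prev_recall = r
    return ap

# Toy example: 2 ground-truth objects, 3 detections (2 correct).
p, r = precision_recall(tp=3, fp=1, fn=1)          # → (0.75, 0.75)
ap = average_precision([(0.9, True), (0.8, False), (0.7, True)], num_gt=2)
```

In COCO-style evaluation, AP would additionally be averaged over several IoU thresholds; the sketch above shows only the single-threshold form.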
ISSN: 2836-1873
DOI: 10.1109/ICCSP60870.2024.10544095