
Traffic signs detection and recognition systems by light-weight multi-stage network

Bibliographic Details
Published in:Multimedia tools and applications 2022-05, Vol.81 (12), p.16155-16169
Main Authors: Hou, Mingzheng, Zhang, Xin, Chen, Yang, Dong, Penglin, Feng, Ziliang
Format: Article
Language:English
Description
Summary:Traffic sign detection and recognition (TSDR) plays an important role in fields such as driver assistance and autonomous driving. However, owing to the complexity of real driving scenes and the variety of traffic signs, many challenges arise, such as inaccurate color segmentation and the high time cost of recognition algorithms based on deep learning. This paper describes an approach for efficient detection and recognition in real-world scenarios. First, a traffic-sign region-of-interest extraction algorithm based on multiple color spaces is proposed; fusing features from the HSV and RGB color spaces yields better color segmentation for the SVM classifier. Next, a novel multi-scale two-stage lightweight network (MSTSN) is investigated, which adopts a coarse-to-fine strategy to improve recognition accuracy. Specifically, the candidate regions of interest (ROIs) are fed into a binary classification layer, and only positive ones are further classified by a multi-class classification network. Depthwise separable convolutions, residual structures, and a feature enhancement module form the bottleneck blocks of MSTSN, which extract more discriminative features while meeting real-time performance requirements. The experimental results demonstrate the effectiveness of the method.
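The coarse-to-fine strategy described above can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's implementation: `coarse_binary_score` and `fine_multiclass` are hypothetical stand-ins for MSTSN's binary gate and multi-class head, and the threshold value is arbitrary. The point it shows is the control flow — only ROIs that pass the cheap binary stage reach the more expensive multi-class stage.

```python
import numpy as np

rng = np.random.default_rng(0)

def coarse_binary_score(roi):
    # Hypothetical stand-in for the binary "sign / not-sign" head:
    # here simply the mean intensity of the ROI.
    return roi.mean()

def fine_multiclass(roi, num_classes=4):
    # Hypothetical stand-in for the multi-class head:
    # picks the class whose (dummy) channel-sum score is largest.
    scores = np.array(
        [roi[..., c % roi.shape[-1]].sum() for c in range(num_classes)]
    )
    return int(scores.argmax())

def two_stage_classify(rois, threshold=0.3):
    """Coarse-to-fine cascade: ROIs rejected by the binary gate are
    labelled -1 (background) and never reach the multi-class stage."""
    labels = []
    for roi in rois:
        if coarse_binary_score(roi) >= threshold:
            labels.append(fine_multiclass(roi))
        else:
            labels.append(-1)
    return labels

# Three synthetic 32x32 RGB "ROIs": one dark (likely background), two bright.
rois = [rng.random((32, 32, 3)) * s for s in (0.1, 1.0, 0.9)]
print(two_stage_classify(rois))
```

In a real TSDR pipeline the gain comes from the asymmetry: most candidate ROIs are background, so rejecting them with a cheap binary classifier means the heavier multi-class network runs on only a small fraction of candidates.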
ISSN:1380-7501
1573-7721
DOI:10.1007/s11042-022-12201-x