
Pedestrian lane detection in unstructured scenes for assistive navigation

Bibliographic Details
Published in: Computer Vision and Image Understanding, 2016-08, Vol. 149, pp. 186-196
Main Authors: Phung, Son Lam; Le, Manh Cuong; Bouzerdoum, Abdesselam
Format: Article
Language: English
Summary:
Highlights:
• We propose an algorithm for pedestrian lane detection in unstructured scenes.
• The vanishing point is found via the color tensor and local orientations of edge pixels.
• A sample lane region is then located via color and geometric features of the lane.
• A lane model adaptive to the input image is constructed for lane segmentation.
• We create a dataset for vanishing point estimation and pedestrian lane detection.

Abstract: Automatic detection of the pedestrian lane in a scene is an important task in assistive and autonomous navigation. This paper presents a vision-based algorithm for pedestrian lane detection in unstructured scenes, where lanes vary significantly in color, texture, and shape and are not indicated by any painted markers. In the proposed method, a lane appearance model is constructed adaptively from a sample image region, which is identified automatically from the image vanishing point. The paper also introduces a fast and robust vanishing point estimation method based on the color tensor and the dominant orientations of color edge pixels. The proposed pedestrian lane detection method is evaluated on a new benchmark dataset that contains images from various indoor and outdoor scenes with different types of unmarked lanes. Experimental results demonstrate its efficiency and robustness in comparison with several existing methods.
ISSN: 1077-3142, 1090-235X
DOI: 10.1016/j.cviu.2016.01.011
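
The abstract describes a two-stage pipeline: vanishing point estimation from the color tensor and dominant edge orientations, followed by adaptive lane segmentation seeded from a sample region located relative to the vanishing point. As an illustration of the first stage only, below is a minimal Python sketch of a Di Zenzo-style color structure tensor combined with Hough-style orientation voting. All function names, parameters, and the voting scheme are assumptions made for illustration; the paper does not publish its implementation here, and this is not the authors' method.

```python
import numpy as np
from scipy import ndimage

def color_structure_tensor(img, sigma=1.5):
    """Di Zenzo color structure tensor: per-channel gradient products,
    summed over channels and Gaussian-smoothed. The smoothing sigma is
    an assumed parameter, not taken from the paper."""
    img = img.astype(np.float64)
    h, w = img.shape[:2]
    Gxx = np.zeros((h, w)); Gyy = np.zeros((h, w)); Gxy = np.zeros((h, w))
    for c in range(img.shape[2]):
        fx = ndimage.sobel(img[..., c], axis=1)  # horizontal gradient
        fy = ndimage.sobel(img[..., c], axis=0)  # vertical gradient
        Gxx += fx * fx
        Gyy += fy * fy
        Gxy += fx * fy
    Gxx = ndimage.gaussian_filter(Gxx, sigma)
    Gyy = ndimage.gaussian_filter(Gyy, sigma)
    Gxy = ndimage.gaussian_filter(Gxy, sigma)
    # Dominant gradient orientation and an edge-strength measure
    # derived from the tensor's eigen-structure.
    theta = 0.5 * np.arctan2(2.0 * Gxy, Gxx - Gyy)
    strength = np.sqrt((Gxx - Gyy) ** 2 + 4.0 * Gxy ** 2)
    return theta, strength

def estimate_vanishing_point(theta, strength, edge_pct=97.0):
    """Crude Hough-style voting: each strong edge pixel votes along its
    edge direction (perpendicular to the gradient direction), and the
    accumulator peak is taken as the vanishing point. This is a
    simplification for illustration, not the paper's estimator."""
    h, w = theta.shape
    acc = np.zeros((h, w))
    thresh = np.percentile(strength, edge_pct)
    ys, xs = np.nonzero(strength > thresh)
    steps = np.arange(1.0, float(max(h, w)))
    for y, x in zip(ys, xs):
        t = theta[y, x] + np.pi / 2.0          # edge direction
        dx, dy = np.cos(t), np.sin(t)
        for s in (1.0, -1.0):                  # walk both ways along the line
            px = np.round(x + s * steps * dx).astype(int)
            py = np.round(y + s * steps * dy).astype(int)
            # Ground-lane vanishing points lie above the edge pixel,
            # so only votes cast above it are counted.
            ok = (px >= 0) & (px < w) & (py >= 0) & (py < y)
            acc[py[ok], px[ok]] += 1.0
    vy, vx = np.unravel_index(np.argmax(acc), acc.shape)
    return int(vx), int(vy)
```

On a typical corridor or footpath image, the converging lane borders should dominate the accumulator; per the abstract, the sample region for the adaptive lane appearance model would then be located from the estimated vanishing point using color and geometric cues.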