
Measurement of the length of pedestrian crossings and detection of traffic lights from image data


Bibliographic Details
Published in: Measurement Science & Technology, 2002-09, Vol. 13 (9), pp. 1450-1457, Article 311
Main Authors: Shioyama, Tadayoshi, Wu, Haiyuan, Nakamura, Naoki, Kitawaki, Suguru
Format: Article
Language:English
Summary: This paper addresses the application of computer vision as a travel aid for the blind. The blind usually use a white cane as a travel aid, but the range over which a cane can detect obstacles is very narrow. We have aimed to develop a device with which the blind can autonomously detect the information needed to negotiate a crossing safely. This paper proposes a method for measuring the length of a pedestrian crossing and for detecting traffic lights from image data observed with a single camera. The length of a crossing is measured from image data of the white lines painted on the road at a crossing by using projective geometry. Furthermore, the state of the traffic lights, green (go signal) or red (stop signal), is detected by extracting candidate traffic-light regions by colour similarity and then selecting the true traffic light from among them using affine moment invariants. Experimental results show that the crossing length is measured with a maximum relative error of less than 5% and an rms error of 0.38 m, and that the traffic light is efficiently detected by selecting the true traffic-light region with an affine moment invariant.
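The length-measurement step rests on a standard projective-geometry fact: the cross ratio of four collinear points is preserved under perspective projection. The sketch below (not the authors' code, and all names and the 0.45 m stripe pitch are illustrative assumptions) shows how, if three stripe edges at known ground distances are located in the image, the ground distance of a fourth image point can be solved from the cross-ratio equation.

```python
def cross_ratio(u1, u2, u3, u4):
    """Cross ratio of four collinear points; invariant under projection."""
    return ((u3 - u1) * (u4 - u2)) / ((u3 - u2) * (u4 - u1))


def recover_distance(x1, x2, x3, img_pts):
    """Given ground positions x1..x3 of three stripe edges and the image
    coordinates of all four collinear points, solve the cross-ratio
    equation  CR = ((x3-x1)(x4-x2)) / ((x3-x2)(x4-x1))  for x4."""
    cr = cross_ratio(*img_pts)
    a = cr * (x3 - x2)
    b = x3 - x1
    return (a * x1 - b * x2) / (a - b)


# Synthetic check: any 1-D perspective map, e.g. u = x / (x + 5),
# preserves the cross ratio, so the far end of the crossing at 12 m
# is recovered from image coordinates alone.
ground = [0.0, 0.45, 0.90, 12.0]           # metres along the road (assumed)
image = [x / (x + 5.0) for x in ground]    # hypothetical projection
length = recover_distance(0.0, 0.45, 0.90, image)
```

Because the cross ratio is exactly projective-invariant, the synthetic recovery is exact up to floating-point error; on real images the accuracy is instead limited by how precisely the stripe edges are localized.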
ISSN: 0957-0233; 1361-6501
DOI: 10.1088/0957-0233/13/9/311
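The traffic-light selection step described in the abstract uses affine moment invariants. As a minimal sketch (not the authors' code; the mask construction and thresholds are assumptions for illustration), the first affine invariant of Flusser and Suk, I1 = (mu20*mu02 - mu11^2) / mu00^4, can be computed from the central moments of a binary candidate region; its value is unchanged by affine transforms, so a round lamp keeps a characteristic value regardless of viewpoint.

```python
import numpy as np


def central_moment(mask, p, q):
    """Central moment mu_pq of a binary region given as a 2-D boolean mask."""
    ys, xs = np.nonzero(mask)
    x_bar, y_bar = xs.mean(), ys.mean()
    return np.sum((xs - x_bar) ** p * (ys - y_bar) ** q)


def affine_invariant_i1(mask):
    """First affine moment invariant (Flusser & Suk):
    I1 = (mu20 * mu02 - mu11^2) / mu00^4.
    For an ideal disc, I1 = 1 / (16 * pi^2), about 0.00633."""
    mu00 = central_moment(mask, 0, 0)  # region area in pixels
    mu20 = central_moment(mask, 2, 0)
    mu02 = central_moment(mask, 0, 2)
    mu11 = central_moment(mask, 1, 1)
    return (mu20 * mu02 - mu11 ** 2) / mu00 ** 4


# Example: a filled disc, a crude stand-in for a traffic-light lamp region
yy, xx = np.mgrid[0:64, 0:64]
disc = (xx - 32) ** 2 + (yy - 32) ** 2 <= 20 ** 2
i1_disc = affine_invariant_i1(disc)
```

A candidate region extracted by colour similarity could then be accepted or rejected by comparing its I1 against the disc-like value, though the paper's actual decision rule is not reproduced here.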