
Illumination compensation based change detection using order consistency

Bibliographic Details
Main Authors: Parameswaran, Vasu, Singh, Maneesh, Ramesh, Visvanathan
Format: Conference Proceeding
Language: English
Description
Summary: We present a change detection method resistant to global and local illumination variations for use in visual surveillance scenarios. Approaches designed thus far for robustness to illumination change are generally based either on color normalization, texture (e.g. edges, rank order statistics, etc.), or illumination compensation. Normalization-based methods sacrifice discriminability, while texture-based methods cannot operate on texture-less regions. Both types of method can produce large missing regions in the distance image, which in turn pose problems for higher-level processing tasks that may be shape- or region-based and require accurate foreground masks (e.g. person detection and tracking, crowd segmentation, etc.). Texture-based methods have an additional problem in that they produce false alarms due to textures induced by local illumination effects (e.g. cast shadows). In this paper we propose a compensation-based approach to change detection. Prior work on compensation has largely taken an empirical approach and has not dealt with the important problem of rejecting outliers when they dominate the scene. In contrast, our generative approach and systematic handling of outliers enable us to achieve robustness to illumination change while eliminating the problems mentioned above. Furthermore, the computational complexity of our method is low enough for real-time performance. Results comparing images taken under strongly different illumination conditions demonstrate the power and generality of the proposed method.
ISSN: 1063-6919
DOI: 10.1109/CVPR.2010.5539873
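
Illustrative sketch (not the paper's algorithm): the summary above describes the general idea of compensating illumination before differencing a frame against a background model, with robust handling of outlier (foreground) pixels. The Python/NumPy toy below is a minimal sketch of that general idea under assumptions of my own: a single grayscale background image, a hypothetical per-block multiplicative gain estimated with a median of pixel ratios, and made-up parameter names (block, tau). The paper's generative model, order-consistency machinery, and systematic outlier rejection are not reproduced here.

import numpy as np

def compensated_change_mask(background, frame, block=16, tau=0.15):
    # Toy compensation-based change detector (illustrative only, not the paper's method).
    # Per block: estimate a multiplicative illumination gain between background and
    # frame with a robust statistic (median of per-pixel ratios), compensate the
    # background, and flag pixels whose relative residual exceeds tau.
    bg = background.astype(np.float64) + 1e-6   # avoid division by zero
    fr = frame.astype(np.float64)
    mask = np.zeros(fr.shape, dtype=bool)
    h, w = fr.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            b = bg[y:y + block, x:x + block]
            f = fr[y:y + block, x:x + block]
            gain = np.median(f / b)             # robust to a minority of foreground (outlier) pixels
            residual = np.abs(f - gain * b) / (gain * b + 1e-6)
            mask[y:y + block, x:x + block] = residual > tau
    return mask

# Synthetic check: darken the whole frame by 40% (global illumination change)
# and paste a bright square (a genuine change); only the square should be flagged.
rng = np.random.default_rng(0)
background = rng.uniform(50, 200, size=(128, 128))
frame = 0.6 * background
frame[40:56, 40:56] = 230.0
mask = compensated_change_mask(background, frame)
print("flagged inside object :", mask[40:56, 40:56].mean())
print("flagged in clean area :", mask[:32, :32].mean())

In this toy, the globally darkened but unchanged regions produce no detections because the block gain absorbs the illumination change, while the pasted square is flagged. If foreground pixels covered most of a block, the simple median gain would be pulled toward the outliers; that is exactly the regime (outliers dominating the scene) that the paper's systematic outlier handling is designed to address.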