Multi-sensor background subtraction by fusing multiple region-based probabilistic classifiers
Published in: Pattern Recognition Letters, 2014-12, Vol. 50, p. 23-33
Main Authors: , , , ,
Format: Article
Language: English
Summary:

Highlights:

- We use RGB-D camera data for foreground/background segmentation.
- Pixel-level and region-level background models based on color and depth data.
- Foreground region prediction based on depth-based histograms.
- Fusion of region-based classifiers as a mixture of experts.

In recent years, the computer vision community has shown great interest in depth-based applications, thanks to the performance and flexibility of the new generation of RGB-D imagery. In this paper, we present an efficient background subtraction algorithm based on the fusion of multiple region-based classifiers that process the depth and color data provided by RGB-D cameras. Foreground objects are detected by combining a region-based foreground prediction (based on depth data) with different background models (based on a Mixture of Gaussians algorithm) that provide color and depth descriptions of the scene at the pixel and region levels. The information given by these modules is fused in a mixture-of-experts fashion to improve the foreground detection accuracy. The main contributions of the paper are the region-based models of both background and foreground, built from the depth and color data. The results obtained on different database sequences demonstrate that the proposed approach achieves higher detection accuracy than existing state-of-the-art techniques.
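The abstract describes the fusion idea only at a high level. As a rough illustration, the Python sketch below combines two OpenCV Mixture-of-Gaussians background subtractors (one fed with color, one with depth) through a fixed-weight, per-pixel mixture-of-experts rule. The function name `segment`, the expert weights, and the decision threshold are assumptions made for this example; the paper's region-level models, depth-histogram foreground prediction, and actual fusion scheme are not reproduced here.

```python
import cv2
import numpy as np

# Illustrative sketch only, not the paper's implementation: two per-pixel
# Mixture-of-Gaussians background models (one on color, one on depth) whose
# outputs are fused with fixed expert weights as a minimal stand-in for the
# paper's region-based mixture-of-experts fusion.

color_model = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=False)
depth_model = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=False)

W_COLOR, W_DEPTH = 0.5, 0.5   # illustrative expert weights (assumed)
THRESHOLD = 0.5               # fused score above this is labeled foreground


def segment(color_frame: np.ndarray, depth_8u: np.ndarray) -> np.ndarray:
    """Return a binary foreground mask by fusing color and depth experts.

    color_frame: 8-bit BGR image; depth_8u: depth map already normalized to a
    single-channel 8-bit image (e.g., with cv2.convertScaleAbs).
    """
    # Each MoG subtractor returns a 0/255 mask; rescale to [0, 1] scores.
    s_color = color_model.apply(color_frame).astype(np.float32) / 255.0
    s_depth = depth_model.apply(depth_8u).astype(np.float32) / 255.0

    # Weighted sum of the experts' scores, followed by a hard decision.
    fused = W_COLOR * s_color + W_DEPTH * s_depth
    return (fused > THRESHOLD).astype(np.uint8) * 255
```

A practical refinement, closer in spirit to the paper, would be to compute such scores per region rather than per pixel and to weight each expert by its local reliability instead of using fixed weights.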
ISSN: 0167-8655, 1872-7344
DOI: 10.1016/j.patrec.2013.09.022