Disparity Refinement with Guided Filtering of Soft 3D Cost Function in Multi-view Stereo System


Bibliographic Details
Main Authors: Lee, Min-Jae, Um, Gi-Mun, Yun, Joungil, Cheong, Won-Sik, Park, Soon-Yong
Format: Conference Proceeding
Language: English
Description
Summary: In multi-view stereo systems, occlusions occur in various viewing directions. In occluded image areas, disparity estimation is generally inaccurate because the matching cost computation is incorrect. Correcting or refining disparity values in occluded areas is therefore an important issue in stereo vision research. The soft 3D reconstruction method, recently introduced by Google, refines inaccurate disparity values in occluded areas by using the probability of visibility (PV) at every image pixel. The probability of visibility is computed from the initial disparity maps of a multi-view stereo system and is then refined with a guided filter, using the reference color image as the guide. However, the color image can contain noise caused by the viewing direction of the reference camera, light reflections, etc., so the probability is affected by image noise. In this paper, we propose a disparity refinement method that enhances the performance of the original soft 3D reconstruction by adopting bilaterally filtered color images as the guide image. The bilateral filter preserves image edges while suppressing color noise through Gaussian smoothing. The filtered color image serves as the guide image when computing the 3D probability volume of visibility in the soft 3D reconstruction. In experiments, we reconstruct 3D point clouds from the refined disparity maps.
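The pipeline the summary describes — bilateral-filter the reference color image, then use it as the guide when filtering a probability/cost slice — can be sketched in plain NumPy. This is a minimal illustrative sketch, not the authors' implementation: the filter radii, sigmas, and epsilon below are assumed placeholder values, and a grayscale guide stands in for the color image.

```python
import numpy as np

def box(img, r):
    """Mean filter over (2r+1)x(2r+1) windows via 2-D cumulative sums (edge-padded)."""
    h, w = img.shape
    pad = np.pad(img, r, mode='edge').astype(float)
    c = np.cumsum(np.cumsum(pad, axis=0), axis=1)
    c = np.pad(c, ((1, 0), (1, 0)))  # zero row/col so window sums index cleanly
    s = (c[2*r+1:2*r+1+h, 2*r+1:2*r+1+w] - c[:h, 2*r+1:2*r+1+w]
         - c[2*r+1:2*r+1+h, :w] + c[:h, :w])
    return s / (2*r + 1) ** 2

def bilateral(img, r, sigma_s, sigma_r):
    """Brute-force bilateral filter: Gaussian in space and in intensity range."""
    h, w = img.shape
    pad = np.pad(img, r, mode='edge').astype(float)
    num = np.zeros((h, w))
    den = np.zeros((h, w))
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            shifted = pad[r+dy:r+dy+h, r+dx:r+dx+w]
            wgt = np.exp(-(dy*dy + dx*dx) / (2 * sigma_s**2)
                         - (shifted - img)**2 / (2 * sigma_r**2))
            num += wgt * shifted
            den += wgt
    return num / den

def guided_filter(I, p, r, eps):
    """He et al.-style guided filter: q = mean(a)*I + mean(b), a,b from local linear fit."""
    mean_I, mean_p = box(I, r), box(p, r)
    cov_Ip = box(I * p, r) - mean_I * mean_p
    var_I = box(I * I, r) - mean_I ** 2
    a = cov_Ip / (var_I + eps)
    b = mean_p - a * mean_I
    return box(a, r) * I + box(b, r)

# Toy example: noisy step-edge guide, one slice of a probability volume.
rng = np.random.default_rng(0)
guide = np.where(np.arange(16)[None, :] < 8, 0.2, 0.8) * np.ones((16, 1))
guide_noisy = guide + 0.05 * rng.standard_normal(guide.shape)
pv_slice = rng.random((16, 16))          # stand-in for one PV slice

smoothed_guide = bilateral(guide_noisy, r=2, sigma_s=2.0, sigma_r=0.2)
pv_refined = guided_filter(smoothed_guide, pv_slice, r=2, eps=1e-3)
```

In the full method this would be applied per depth slice of the 3D probability volume; the point of pre-filtering the guide is that guide-image noise otherwise leaks directly into the filtered probabilities through the local linear model.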
ISSN: 2151-2205
DOI: 10.1109/IVCNZ48456.2019.8961015