Robust Light Field Depth Estimation Using Occlusion-Noise Aware Data Costs

Bibliographic Details
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence, 2018-10, Vol. 40 (10), pp. 2484-2497
Main Authors: Williem; In Kyu Park; Kyoung Mu Lee
Format: Article
Language: English
Description
Summary: Depth estimation is essential in many light field applications, and numerous algorithms have been developed that exploit a range of light field properties. However, conventional data costs fail on noisy scenes in which occlusion is present. To address this problem, we introduce a light field depth estimation method that is more robust against occlusion and less sensitive to noise. Two novel data costs are proposed, measured on the angular patch and the refocus image, respectively. The constrained angular entropy cost (CAE) suppresses the effects of the dominant occluder and of noise within the angular patch, yielding a low cost at the correct depth. The constrained adaptive defocus cost (CAD) provides a low cost in occlusion regions while remaining robust against noise. Integrating the two data costs significantly improves invariance to both occlusion and noise. Cost volume filtering and graph cut optimization are then applied to improve the accuracy of the depth map. Experimental results confirm the robustness of the proposed method and demonstrate that it produces high-quality depth maps across a range of scenes, outperforming other state-of-the-art light field depth estimation methods in both qualitative and quantitative evaluations.
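To make the two data costs concrete, below is a minimal NumPy sketch. It is not the paper's implementation: the function names, the 9x9 angular patch size, the histogram bin count, and the toy data are all illustrative assumptions. It computes an entropy cost over an angular patch in the spirit of CAE, and a simple defocus-style cost against the refocused image in the spirit of CAD; the paper's constrained-region selection, cost-volume construction, filtering, and graph cut steps are omitted.

    import numpy as np

    def angular_entropy_cost(angular_patch, num_bins=32):
        # Entropy of the intensity histogram of an angular patch.
        # At the correct depth, a non-occluded Lambertian point gives a
        # near-constant patch, hence a peaked histogram and low entropy.
        # The paper's CAE additionally constrains the patch to discard
        # the dominant occluder; that refinement is omitted here.
        hist, _ = np.histogram(angular_patch, bins=num_bins, range=(0.0, 1.0))
        p = hist / max(hist.sum(), 1)
        p = p[p > 0]
        return float(-(p * np.log(p)).sum())

    def adaptive_defocus_cost(refocus_patch, center_value):
        # Defocus-style cost: mean absolute difference between a spatial
        # patch of the refocused image and the center-view pixel. At the
        # correct depth the refocused image is sharp, so the cost is low;
        # spatial averaging damps sensor noise.
        return float(np.mean(np.abs(refocus_patch - center_value)))

    # Toy demonstration: noise barely raises the entropy cost, while a
    # dominant occluder covering half the angular views raises it sharply.
    rng = np.random.default_rng(0)
    clean = 0.5 + 0.01 * rng.standard_normal((9, 9))   # noisy, unoccluded
    occluded = clean.copy()
    occluded[:, :4] = 0.1                              # occluder in half the views
    print(angular_entropy_cost(clean))     # low
    print(angular_entropy_cost(occluded))  # noticeably higher

    refocused = np.full((5, 5), clean.mean())          # crude refocus stand-in
    print(adaptive_defocus_cost(refocused, center_value=0.5))  # near zero

In the method described by the abstract, costs of this kind are evaluated per depth hypothesis to form a cost volume, which is then filtered and regularized with graph cut optimization to obtain the final depth map.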
ISSN: 0162-8828, 1939-3539, 2160-9292
DOI: 10.1109/TPAMI.2017.2746858