
Rapid and robust two-dimensional phase unwrapping via deep learning

Bibliographic Details
Published in: Optics Express 2019-08, Vol. 27 (16), p. 23173-23185
Main Authors: Zhang, Teng, Jiang, Shaowei, Zhao, Zixin, Dixit, Krishna, Zhou, Xiaofei, Hou, Jia, Zhang, Yongbing, Yan, Chenggang
Format: Article
Language: English
Description
Summary: Two-dimensional phase unwrapping algorithms are widely used in optical metrology and measurements. The high noise in interference measurements, however, often causes conventional phase unwrapping algorithms to fail. In this paper, we propose a deep convolutional neural network (DCNN) based method for rapid and robust two-dimensional phase unwrapping. In our approach, we employ a DCNN architecture, DeepLabV3+, with noise suppression and strong feature-representation capabilities. The DCNN first performs semantic segmentation of the wrapped phase map; we then combine the wrapped phase map with the segmentation result to generate the unwrapped phase. We benchmarked our results against well-established methods, and the reported approach outperformed the conventional path-dependent and path-independent algorithms. We also tested the robustness of the reported approach using interference measurements from optical metrology setups; our results again clearly outperformed the conventional phase unwrapping algorithms. The reported approach may find applications in optical metrology and microscopy imaging.
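
The following minimal sketch (not the authors' published code) illustrates the combination step described in the summary, under the assumption that the segmentation network predicts a per-pixel integer wrap count k, so the continuous phase is recovered as phi = psi + 2*pi*k; all function and variable names are illustrative.

import numpy as np

def combine_wrap_count(wrapped_phase, wrap_count):
    # Recover the continuous phase from the wrapped phase psi and an integer
    # wrap-count map k (here assumed to be the segmentation network's output):
    # phi = psi + 2*pi*k.
    return wrapped_phase + 2.0 * np.pi * wrap_count

# Toy check with a synthetic linear phase ramp.
true_phase = np.tile(np.linspace(0.0, 8.0 * np.pi, 256), (64, 1))
wrapped = np.angle(np.exp(1j * true_phase))                 # wrap into (-pi, pi]
ideal_k = np.round((true_phase - wrapped) / (2 * np.pi))    # stand-in for the DCNN prediction
assert np.allclose(combine_wrap_count(wrapped, ideal_k), true_phase)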
ISSN: 1094-4087
DOI: 10.1364/OE.27.023173