Three-dimensional Shape Reconstruction from Single-shot Speckle Image Using Deep Convolutional Neural Networks

Bibliographic Details
Published in: Optics and Lasers in Engineering, 2021-08, Vol. 143, p. 106639, Article 106639
Main Authors: Nguyen, Hieu, Tran, Tan, Wang, Yuzeng, Wang, Zhaoyang
Format: Article
Language:English
Description
Summary:
• 3D shapes can be reconstructed from a single-shot image
• The structured-light technique and convolutional neural networks are combined
• Three convolutional neural networks are proposed
• The fringe pattern works better than the speckle image for 3D shape reconstruction
• The speed is much faster than that of conventional 3D shape reconstruction techniques

Three-dimensional (3D) shape reconstruction from a monocular two-dimensional (2D) image has emerged as a highly demanded tool in many applications. This paper presents a novel 3D shape reconstruction technique that employs an end-to-end deep convolutional neural network (CNN) to transform a single speckle-pattern image into its corresponding 3D point cloud. In the proposed approach, three CNN models are explored and compared to find the most capable network. To train the models with reliable datasets, a multi-frequency fringe projection profilometry technique is adopted to prepare high-accuracy ground-truth 3D labels. Unlike conventional 3D imaging and shape reconstruction techniques, which often involve complicated algorithms and intensive computation, the proposed technique is simple, fast, and robust. Several experiments have been conducted to assess and validate the proposed approach, and its capability offers promising solutions for a growing range of scientific research and engineering applications.
ISSN: 0143-8166, 1873-0302
DOI: 10.1016/j.optlaseng.2021.106639