Tri-level optimization-based image rectification for polydioptric cameras

Bibliographic Details
Published in: Signal Processing: Image Communication, 2020-09, Vol. 87, p. 115884, Article 115884
Main Authors: Lee, Siyeong, An, Gwon Hwan, Kim, Joonsoo, Yun, Kugjin, Cheong, Won-Sik, Kang, Suk-Ju
Format: Article
Language: English
Description
Summary: Recently, as the number of cameras built into polydioptric cameras has increased, image rectification for these cameras has become complicated. However, because conventional methods cannot compensate for calibration errors or are limited by the camera arrangement, they cannot be widely applied to the various kinds of polydioptric cameras. In this work, we adopted the idea of disparity-error minimization to overcome these limitations and introduced several improvements to optimization-based rectification: (1) we modified the objective function to include both the x and y disparity errors; (2) we added a regularization term so that the optimization remains robust to mismatched pairs; (3) we employed tri-level optimization to determine the camera poses corresponding to the rectified images. For two representative polydioptric cameras, this method reduced the average disparity error by up to 66.57% compared with conventional methods, showing that it generalizes well across camera types and clearly improves on existing methods.
•An image-rectification method for polydioptric cameras is proposed.
•It is based on an optimization that reduces the disparity error of correspondences.
•The disparity error was reduced by 66.57% compared with the conventional methods.
•Low x and y disparity errors were achieved in various types of images.
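To make the optimization idea in the summary concrete, the following is a minimal sketch, not the authors' implementation: it assumes per-camera rotational corrections (axis-angle vectors) applied as homographies H_i = K R_i K^{-1}, a set of point correspondences between camera pairs, and a Huber loss standing in for the paper's regularization against mismatched pairs. The tri-level scheme and the exact form of the x-disparity term are not reproduced; the names `warp`, `residuals`, `rectify`, and the structure of `matches` are illustrative assumptions.

```python
# Hedged sketch of a disparity-error-minimizing rectification objective.
# Assumptions (hypothetical, not from the paper):
# - camera i is rectified by H_i = K @ R(r_i) @ inv(K), r_i an axis-angle vector;
# - `matches` is a list of tuples (i, j, xi, xj), where xi, xj are Nx2 pixel
#   coordinates of corresponding points in cameras i and j;
# - after rectification the y-disparity of each pair should vanish, and the
#   x-disparity is pulled toward a per-pair median as a simple consistency term.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation


def warp(pts, H):
    """Apply a 3x3 homography to Nx2 pixel coordinates."""
    ph = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return ph[:, :2] / ph[:, 2:3]


def residuals(params, K, matches, n_cams):
    """Stack x- and y-disparity residuals over all matched pairs."""
    rotvecs = params.reshape(n_cams, 3)
    Hs = [K @ Rotation.from_rotvec(r).as_matrix() @ np.linalg.inv(K)
          for r in rotvecs]
    res = []
    for i, j, xi, xj in matches:
        wi, wj = warp(xi, Hs[i]), warp(xj, Hs[j])
        d = wi - wj                               # disparity after rectification
        res.append(d[:, 1])                       # y-disparity -> 0
        res.append(d[:, 0] - np.median(d[:, 0]))  # x-disparity -> consistent
    return np.concatenate(res)


def rectify(K, matches, n_cams):
    """Estimate per-camera rotational corrections by robust least squares."""
    x0 = np.zeros(3 * n_cams)  # zero correction relative to the given calibration
    sol = least_squares(residuals, x0, args=(K, matches, n_cams),
                        loss="huber", f_scale=1.0)  # robust to mismatched pairs
    return sol.x.reshape(n_cams, 3)
```

In this sketch the robust (Huber) loss down-weights outlier correspondences, which is only a stand-in for the regularization term described in the abstract; the paper's tri-level optimization, which additionally determines the camera poses of the rectified images, would wrap further levels around this inner least-squares problem.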
ISSN: 0923-5965, 1879-2677
DOI: 10.1016/j.image.2020.115884