Underwater image matching by incorporating structural constraints

Bibliographic Details
Published in:International journal of advanced robotic systems 2017-12, Vol.14 (6), p.172988141773810
Main Authors: Yang, Xu, Liu, Zhi-Yong, Qiao, Hong, Song, Yong-Bo, Ren, Shu-Nan, Ji, Da-Xiong, Zheng, Sui-Wu
Format: Article
Language:English
Subjects:
Description
Summary:Underwater robots play an important role in underwater perception and manipulation tasks. Vision information processing is essential for the intelligent perception of an underwater robot, and image matching is a fundamental topic within it. Feature-based image matching is suitable for the underwater environment. However, current underwater image matching usually directly applies methods that are general-purpose or designed for images captured on land. The problem is that the blurred appearance of underwater images causes feature descriptor ambiguity, which may greatly deteriorate the performance of these methods. Aiming at this problem, this article presents an underwater image matching framework that incorporates structural constraints. By integrating the appearance descriptor and structural information through a graph model, feature correspondence-based image matching is formulated and solved as a graph matching problem. In particular, to handle the outlier feature problem, the graph matching method remains applicable when outlier features exist in both underwater images. Experiments on both synthetic points and real-world underwater images validate the effectiveness of the proposed method.
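The abstract describes combining an appearance descriptor (unary affinity) with pairwise structural constraints in a graph model and solving for feature correspondences via graph matching. A minimal sketch of that idea, assuming two small point sets with per-point descriptors, is below; it uses spectral matching with power iteration and greedy discretization as a stand-in solver, which is a common graph matching baseline and not necessarily the exact method of the article. All names and parameters here are illustrative.

```python
import numpy as np

def spectral_match(desc1, pts1, desc2, pts2, sigma_a=1.0, sigma_s=1.0):
    """Match points of set 1 to set 2 using appearance + structural affinities.

    Illustrative sketch: diagonal of M holds unary (descriptor) affinities,
    off-diagonal entries hold pairwise distance-consistency affinities.
    """
    n1, n2 = len(pts1), len(pts2)
    cand = [(i, j) for i in range(n1) for j in range(n2)]  # candidate matches
    m = len(cand)
    M = np.zeros((m, m))
    for a, (i, j) in enumerate(cand):
        # Unary (appearance) affinity: similarity of feature descriptors.
        M[a, a] = np.exp(-np.linalg.norm(desc1[i] - desc2[j]) ** 2 / sigma_a)
        for b, (k, l) in enumerate(cand):
            if a == b or i == k or j == l:
                continue  # skip conflicting candidate pairs
            # Pairwise (structural) affinity: consistency of point distances.
            d1 = np.linalg.norm(pts1[i] - pts1[k])
            d2 = np.linalg.norm(pts2[j] - pts2[l])
            M[a, b] = np.exp(-(d1 - d2) ** 2 / sigma_s)
    # Principal eigenvector of M via power iteration.
    x = np.ones(m) / np.sqrt(m)
    for _ in range(100):
        x = M @ x
        x /= np.linalg.norm(x)
    # Greedy discretization into a one-to-one assignment.
    matches, used1, used2 = {}, set(), set()
    for a in np.argsort(-x):
        i, j = cand[a]
        if i not in used1 and j not in used2:
            matches[i] = j
            used1.add(i)
            used2.add(j)
    return matches

# Usage: recover a known permutation of three points.
pts1 = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 3.0]])
desc1 = pts1.copy()            # toy descriptors: the coordinates themselves
perm = [2, 0, 1]
pts2, desc2 = pts1[perm], desc1[perm]
print(spectral_match(desc1, pts1, desc2, pts2))  # {0: 1, 1: 2, 2: 0}
```

Because structural affinities depend only on intra-set distances, matches that disagree with the geometric layout score poorly even when descriptors are ambiguous, which is the role the structural constraints play in the framework described above.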
ISSN:1729-8806
1729-8814
DOI:10.1177/1729881417738100