
Robust feature point matching by preserving local geometric consistency


Bibliographic Details
Published in: Computer Vision and Image Understanding, 2009-06, Vol. 113 (6), p. 726-742
Main Authors: Choi, Ouk; Kweon, In So
Format: Article
Language: English
Description
Summary: We present a method for matching feature points robustly across widely separated images. In general, it is difficult to match feature points correctly using only the similarity between local descriptors. In our approach, the correspondence problem is formulated as an optimization problem with one-to-one correspondence constraints. A novel objective function is defined to preserve local image-to-image affine transformations across correspondences. This objective function enables our method to cope with significant viewpoint or scale changes between images, unlike previous methods that relied on the assumption that distances or orientations between neighboring feature points are preserved across images. A relaxation algorithm is proposed for maximizing the objective function while imposing one-to-one correspondence constraints, unlike conventional relaxation labeling algorithms, which impose many-to-one correspondence constraints. Experimental evaluation shows that our method is robust to significant viewpoint changes, scale changes, and nonrigid deformations between images, even in the presence of repeated textures that make feature point matching more ambiguous. Our method is also applied to object recognition in cluttered environments, with promising results.
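The abstract describes the formulation only at a high level. The Python sketch below illustrates the general idea it outlines: score each candidate correspondence by how well its local affine transformation agrees with those of the other candidates, then enforce one-to-one assignments. All names, data structures, and the specific compatibility score and greedy assignment pass are illustrative assumptions, not the authors' actual objective function or relaxation algorithm.

import numpy as np

def affine_consistency(A_i, A_j):
    # Compatibility of two candidate matches: high when their local 2x2
    # affine transforms (scale/rotation/shear) are similar. (Assumed score,
    # not the paper's objective.)
    return np.exp(-np.linalg.norm(A_i - A_j))

def match_score(candidates, k):
    # Total support of candidate k from all other candidates.
    A_k = candidates[k]["affine"]
    return sum(affine_consistency(A_k, c["affine"])
               for i, c in enumerate(candidates) if i != k)

def greedy_one_to_one(candidates):
    # Greedy stand-in for the paper's relaxation algorithm: repeatedly accept
    # the best-supported candidate whose two endpoints are still unmatched,
    # which enforces the one-to-one correspondence constraint.
    order = sorted(range(len(candidates)),
                   key=lambda k: match_score(candidates, k), reverse=True)
    used1, used2, accepted = set(), set(), []
    for k in order:
        p, q = candidates[k]["p1"], candidates[k]["p2"]
        if p not in used1 and q not in used2:
            used1.add(p)
            used2.add(q)
            accepted.append(candidates[k])
    return accepted

if __name__ == "__main__":
    # Toy candidates: (feature index in image 1, feature index in image 2,
    # 2x2 local affine transform estimated from the surrounding patches).
    R = np.array([[0.0, -1.0], [1.0, 0.0]])      # roughly a 90-degree rotation
    candidates = [
        {"p1": 0, "p2": 0, "affine": R},
        {"p1": 1, "p2": 1, "affine": R + 0.05},
        {"p1": 1, "p2": 2, "affine": np.eye(2)},  # inconsistent outlier
    ]
    for m in greedy_one_to_one(candidates):
        print(m["p1"], "->", m["p2"])

In this toy run, the two candidates with mutually consistent local affine transforms support each other and are accepted, while the outlier conflicts with an already-matched feature and is discarded.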
ISSN: 1077-3142, 1090-235X
DOI: 10.1016/j.cviu.2008.12.002