Ground target localization and recognition via descriptors fusion

Bibliographic Details
Published in:IOP conference series. Materials Science and Engineering 2019-09, Vol.610 (1), p.12015
Main Authors: Kamel, Mohamed M, Taha, Hussein S, Salama, Gouda I, Elhalwagy, Yehia Z
Format: Article
Language:English
Description
Summary: Keypoint matching is the task of precisely locating the position of a particular point across two images. Recently, keypoint descriptors have had a major impact on target detection by aiming to be strongly invariant to rotation, scale, and translation. Detection is carried out by matching a reference image against the input scene to localize the desired target. This work proposes an approach that fuses the state-of-the-art feature descriptors ORB and BRISK for accurate ground target detection in two phases. First, in an off-line phase, the fused features are extracted from different perspective and azimuth angles of the desired target to build a comprehensive reference-image representation. Second, in an on-line phase, fused features are extracted from the whole scene and matched against the stored reference features to find keypoint correspondences. Outliers are eliminated using the Random Sample Consensus (RANSAC) algorithm, which also speeds up the matching procedure. A comparative analysis reveals the discriminative power of the fused features in localization and recognition tasks while keeping the proposed system running in real time.
ISSN:1757-8981
1757-899X
DOI:10.1088/1757-899X/610/1/012015