
Support vector regression and extended nearest neighbor for video object retrieval

Bibliographic Details
Published in: Evolutionary Intelligence 2022-06, Vol. 15 (2), p. 837-850
Main Authors: Ghuge, C. A., Ruikar, Sachin D., Prakash, V. Chandra
Format: Article
Language:English
Summary: Video retrieval is an emerging area that has benefited from advances in video-capture technology, which have made a huge mass of videos available. For a given text or image query, retrieving the relevant videos and the objects within them is not always easy. In previous work, a hybrid model combining the Nearest Search Algorithm (NSA) and the exponential weighted moving average (EWMA) was developed for video object retrieval; in NSA + EWMA, object trajectories are retrieved based on a query-specific distance. This work extends that model with a novel path-equalization scheme that equalizes the path lengths of the query and the tracked object. First, a hybrid model based on support vector regression (SVR) and NSA tracks the position of the object in the video. The proposed density-measure scheme then equalizes the path lengths of the query and the object, and the resulting query path is given to an extended nearest neighbor classifier to retrieve the video. Simulation results show that the proposed retrieval scheme achieved values of 0.901, 0.860, 0.849, and 0.922 for precision, recall, F-measure, and multiple object tracking precision, respectively.
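To illustrate the tracking step described in the abstract, the following is a minimal sketch (not the authors' implementation) of how support vector regression can predict an object's next position from its recent trajectory, fitting one SVR per coordinate. The synthetic trajectory, window size, and hyperparameters are all assumptions made for the example.

```python
# Hypothetical sketch: SVR-based position prediction from trajectory history.
# Assumes scikit-learn; the data and parameters are illustrative only.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Synthetic trajectory: an object moving roughly along a line, with noise.
t = np.arange(40, dtype=float)
traj = np.stack([2.0 * t + rng.normal(0, 0.5, t.size),
                 1.0 * t + rng.normal(0, 0.5, t.size)], axis=1)

window = 4  # number of past positions used to predict the next one

# Build (history -> next position) training pairs from the trajectory.
X = np.array([traj[i:i + window].ravel() for i in range(len(traj) - window)])
y = traj[window:]

# One regressor per coordinate (x and y).
models = [SVR(kernel="rbf", C=10.0).fit(X, y[:, d]) for d in range(2)]

# Predict the position that follows the last observed window.
last = traj[-window:].ravel().reshape(1, -1)
pred = np.array([m.predict(last)[0] for m in models])
print(pred)
```

In a full pipeline along the lines the abstract sketches, such per-frame position predictions would form the tracked object's path, whose length would then be equalized against the query path before classification.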
ISSN: 1864-5909, 1864-5917
DOI: 10.1007/s12065-018-0176-y