
Video-based salient object detection using hybrid optimisation strategy and contourlet mapping

Bibliographic Details
Published in:International journal of image and data fusion 2020-04, Vol.11 (2), p.162-184
Main Authors: A., Saju, Suresh, H. N.
Format: Article
Language:English
Description
Summary:The advancements in salient object detection have attracted many researchers, and the task is significant in several computer vision applications. However, efficient salient object detection from still images remains a major challenge. This paper proposes a salient object detection technique using the proposed Spider-Gray Wolf Optimiser (S-GWO) algorithm, which is designed by combining the Gray Wolf Optimiser (GWO) and Spider Monkey Optimisation (SMO). The technique comprises three steps: keyframe extraction, saliency mapping together with contourlet mapping, and fusion of the obtained outputs using optimal coefficients. Initially, the extracted frames are subjected to saliency mapping and contourlet mapping simultaneously in order to determine the quality of each pixel. Then, the outputs obtained from the saliency mapping and the contourlet mapping are fused using coefficients to obtain the final result, which is employed for detecting the salient objects. Here, the proposed S-GWO is employed to select the optimal fusion coefficients. The experimental evaluation of the proposed S-GWO based on the performance metrics reveals that it attained a maximal accuracy, sensitivity and specificity of 0.914, 0.861 and 0.929, respectively.
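The fusion step described in the summary can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it assumes the saliency map and contourlet map are same-sized arrays of per-pixel scores, combines them with two fusion coefficients (which the paper selects via S-GWO; here they are plain inputs), and thresholds the fused map into a binary salient-object mask. The function names `fuse_maps` and `detect_salient` are hypothetical.

```python
import numpy as np

def fuse_maps(saliency_map, contourlet_map, alpha, beta):
    """Weighted fusion of a saliency map and a contourlet map.

    alpha and beta are the fusion coefficients; in the paper these would
    be chosen by the S-GWO optimiser, here they are fixed inputs.
    """
    fused = alpha * saliency_map + beta * contourlet_map
    # Normalise to [0, 1] so the result can be thresholded consistently
    fused = (fused - fused.min()) / (fused.max() - fused.min() + 1e-12)
    return fused

def detect_salient(fused, threshold=0.5):
    """Threshold the fused map into a binary salient-object mask."""
    return (fused >= threshold).astype(np.uint8)

# Example usage on random stand-in maps (real inputs would come from
# saliency mapping and contourlet mapping of an extracted keyframe)
rng = np.random.default_rng(0)
saliency = rng.random((64, 64))
contourlet = rng.random((64, 64))
mask = detect_salient(fuse_maps(saliency, contourlet, alpha=0.6, beta=0.4))
```

An optimiser such as S-GWO would treat `(alpha, beta)` as the search space and score each candidate pair by a detection-quality metric on training frames.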
ISSN:1947-9832
1947-9824
DOI:10.1080/19479832.2019.1683625