Motion estimation in hazy videos
Published in: Pattern Recognition Letters, October 2021, Vol. 150, pp. 130-138
Main Authors:
Format: Article
Language: English
Summary:
• A novel end-to-end deep network is proposed for haze removal from hazy video frames.
• The failure of conventional approaches in estimating motion information in hazy videos is analyzed.
• A novel haze removal → motion estimation approach is proposed for motion estimation in hazy videos.
Motion estimation is a basic requirement for the success of many video analysis algorithms, such as moving object detection and human activity recognition. Most motion estimation algorithms are sensitive to weather conditions and thus fail to estimate motion in degraded weather. Severe weather such as snow, rain, haze, and smog degrades the performance and reliability of video analysis algorithms. In this paper, we analyze the effect of haze on motion estimation in hazy videos. We propose a cascaded architecture, i.e., haze removal followed by optical flow, for motion estimation in hazy videos. The proposed image de-hazing network is built upon the Residual and Inception module concepts and is named ResINet. Optical flow is then used to estimate the motion information. We carry out a visual analysis to validate the proposed approach for motion estimation in hazy videos. To further validate the proposed ResINet for de-hazing, we carry out a quantitative analysis on two benchmark image de-hazing datasets.
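The abstract describes a two-stage cascade: a dehazing CNN built from residual and Inception-style blocks (ResINet), followed by optical flow on the dehazed frames. Below is a minimal sketch of that idea in PyTorch and OpenCV; the block layout, channel widths, and the choice of Farneback flow are illustrative assumptions, not the paper's exact design.

```python
# Illustrative sketch only: module names, channel sizes, and the use of
# OpenCV's Farneback flow are assumptions, not the paper's exact design.
import torch
import torch.nn as nn
import cv2
import numpy as np

class ResInceptionBlock(nn.Module):
    """Inception-style multi-scale convolutions wrapped in a residual skip."""
    def __init__(self, channels: int):
        super().__init__()
        branch_ch = channels // 4
        self.b1 = nn.Conv2d(channels, branch_ch, kernel_size=1)
        self.b3 = nn.Conv2d(channels, branch_ch, kernel_size=3, padding=1)
        self.b5 = nn.Conv2d(channels, branch_ch, kernel_size=5, padding=2)
        self.b7 = nn.Conv2d(channels, branch_ch, kernel_size=7, padding=3)
        self.fuse = nn.Conv2d(branch_ch * 4, channels, kernel_size=1)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        # Concatenate multi-scale branches, fuse, and add the residual input.
        multi = torch.cat([self.b1(x), self.b3(x), self.b5(x), self.b7(x)], dim=1)
        return self.act(x + self.fuse(multi))

class ResINetSketch(nn.Module):
    """Hypothetical end-to-end dehazing network: hazy RGB in, clean RGB out."""
    def __init__(self, channels: int = 64, num_blocks: int = 4):
        super().__init__()
        self.head = nn.Conv2d(3, channels, kernel_size=3, padding=1)
        self.body = nn.Sequential(*[ResInceptionBlock(channels)
                                    for _ in range(num_blocks)])
        self.tail = nn.Conv2d(channels, 3, kernel_size=3, padding=1)

    def forward(self, hazy):
        return self.tail(self.body(self.head(hazy)))

def dehaze_then_flow(model, frame_a, frame_b):
    """Cascade: dehaze two consecutive frames, then estimate dense flow."""
    def dehaze(frame_bgr):
        t = torch.from_numpy(frame_bgr).float().permute(2, 0, 1).unsqueeze(0) / 255.0
        with torch.no_grad():
            out = model(t).clamp(0, 1)
        return (out[0].permute(1, 2, 0).numpy() * 255).astype(np.uint8)

    a, b = dehaze(frame_a), dehaze(frame_b)
    gray_a = cv2.cvtColor(a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(b, cv2.COLOR_BGR2GRAY)
    # Farneback stands in for whichever flow estimator the paper pairs
    # with the dehazing stage.
    return cv2.calcOpticalFlowFarneback(gray_a, gray_b, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
```

The point of the cascade is that the flow estimator operates on restored frames, so its brightness-constancy assumption is less violated by haze than it would be on the raw input.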
ISSN: 0167-8655, 1872-7344
DOI: 10.1016/j.patrec.2021.06.029