
EDHnet: An Event-Based Dual-Stream Hybrid Network for Image Motion Deblurring

Bibliographic Details
Published in: IEEE Sensors Journal, 2024-10, Vol. 24 (20), p. 32884-32897
Main Authors: Gong, Yuanhao; Lin, Zewei
Format: Article
Language:English
Description
Summary: Exposure time causes inevitable motion blur in traditional frame-based camera imaging. In contrast, event cameras, inspired by biology, are renowned for their high temporal resolution, capturing motion information at microsecond (even nanosecond) scales and thus offering a promising new approach to the motion-deblurring task. In this article, we introduce a novel event-based dual-stream hybrid deblurring network (EDHNet) for motion deblurring. It divides the deblurring task into two sub-tasks for severe and mild motion blur, corresponding to event stream data with different spatio-temporal densities. Our network employs a severe-deblurring network (SDNet) and a mild-deblurring network (MDNet) to address severe and mild motion blur, respectively, and integrates a fusion module that generates the final sharp images from the combined output features of these two branches. To tackle severe blur and dense event streams, spiking neural networks and convolutional neural networks (CNNs) are adopted to extract features from the event stream, and a dual cross-attention module fuses the features from the event stream and the images. For mild blur, CNNs are adopted, with a proposed residual fusion layer that effectively leverages sparse event data. The experimental results confirm that our method establishes a new state of the art on the GoPro dataset for motion deblurring and demonstrates exceptional generalization in real-world blur scenarios.
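The dual cross-attention fusion mentioned in the summary can be sketched roughly as follows. This is a minimal NumPy illustration, not the authors' implementation: it assumes event and image features have been flattened into token matrices of the same length and dimension, and all function and variable names (`cross_attention`, `dual_cross_attention`, `event_feat`, `image_feat`) are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, keys_values, d):
    # Queries from one modality attend over keys/values from the other.
    scores = queries @ keys_values.T / np.sqrt(d)   # (n_q, n_kv)
    return softmax(scores, axis=-1) @ keys_values   # (n_q, d)

def dual_cross_attention(event_feat, image_feat):
    # Attention runs in both directions (events -> images, images -> events),
    # and the two attended feature sets are concatenated along the channel dim.
    d = event_feat.shape[-1]
    e2i = cross_attention(event_feat, image_feat, d)  # events query images
    i2e = cross_attention(image_feat, event_feat, d)  # images query events
    return np.concatenate([e2i, i2e], axis=-1)        # (n, 2 * d)
```

For example, fusing 4 event tokens and 4 image tokens of dimension 8 yields a (4, 16) fused feature matrix. In the actual network these features would come from the SNN/CNN branches and pass to the fusion module, but those stages are omitted here.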
ISSN: 1530-437X, 1558-1748
DOI: 10.1109/JSEN.2024.3454153