EVS-Assisted Joint Deblurring, Rolling-Shutter Correction and Video Frame Interpolation Through Sensor Inverse Modeling


Bibliographic Details
Main Authors: Jiang, Rui, Tu, Fangwen, Long, Yixuan, Vaish, Aabhaas, Zhou, Bowen, Wang, Qinyi, Zhang, Wei, Fang, Yuntan, Garcia Capel, Luis Eduardo, Mu, Bo, Dai, Tiejun, Suess, Andreas
Format: Conference Proceeding
Language: English
Description
Summary: Event-based Vision Sensors (EVS) are gaining popularity for enhancing CMOS Image Sensor (CIS) video capture. Nonidealities of EVS such as pixel or readout latency can significantly influence the quality of the enhanced images and warrant dedicated consideration in the design of fusion algorithms. A novel approach is presented for jointly computing deblurred, rolling-shutter-artifact-corrected high-speed videos with frame rates up to 10000 FPS, using inherently blurry rolling-shutter CIS frames at 120 FPS to 150 FPS in conjunction with EVS data from a hybrid CIS-EVS sensor. EVS pixel latency, readout latency, and the sensor's refractory period are explicitly incorporated into the measurement model. The resulting inverse problem is solved on a per-pixel basis using an optimization-based framework. The interpolated images are subsequently processed by a novel refinement network. The proposed method is evaluated on simulated and measured datasets, under both natural and controlled environments. Extensive experiments show a reduced shadowing effect, a 4 dB increase in PSNR, and a 12% improvement in LPIPS score compared to state-of-the-art methods.
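The summary's measurement model explicitly includes the EVS refractory period, during which a pixel cannot fire again after emitting an event. As a minimal illustrative sketch (not the authors' model), the following simulates event generation from a 1-D log-intensity signal using the standard contrast-threshold EVS model with a refractory period; all parameter values are illustrative assumptions.

```python
import numpy as np

def simulate_events(log_intensity, timestamps, threshold=0.2, refractory=1e-3):
    """Emit (time, polarity) events when the log intensity deviates from the
    level at the last event by at least `threshold`, while suppressing any
    event within `refractory` seconds of the previous one."""
    events = []
    ref_level = log_intensity[0]  # reference level at the last emitted event
    last_t = -np.inf              # time of the last emitted event
    for t, L in zip(timestamps, log_intensity):
        delta = L - ref_level
        if abs(delta) >= threshold and (t - last_t) >= refractory:
            events.append((t, 1 if delta > 0 else -1))
            ref_level = L
            last_t = t
    return events

# Example: a brightening-then-dimming ramp sampled at 10 kHz.
t = np.linspace(0, 0.01, 101)
L = np.concatenate([np.linspace(0, 1, 51), np.linspace(1, 0, 50)])
ev = simulate_events(L, t)
```

On this signal the simulator emits a burst of positive events on the rising ramp followed by negative events on the falling ramp, with consecutive events spaced at least one refractory period apart; a longer refractory period thins out the event stream, which is one nonideality a fusion algorithm must account for.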
ISSN: 2575-7075
DOI: 10.1109/CVPR52733.2024.02378