Towards Real-world Event-guided Low-light Video Enhancement and Deblurring
– Publication Date: TBD
– Category: Low-light Video Enhancement and Deblurring
– Place of publication: European Conference on Computer Vision (ECCV) 2024
Abstract:
In low-light conditions, capturing videos with frame-based cameras often requires long exposure times, resulting in motion blur and reduced visibility. Although frame-based motion deblurring and low-light enhancement have each been studied, both tasks remain challenging. Event cameras have emerged as a promising solution for improving image quality in low-light environments and addressing motion blur. They offer two key advantages: their high dynamic range lets them capture scene details even in low light, and their high temporal resolution lets them capture motion information during long exposures. Despite separate efforts to tackle low-light enhancement and motion deblurring with event cameras, previous work has not addressed both simultaneously. To explore this joint task, we first establish real-world datasets for event-guided low-light enhancement and deblurring using a hybrid camera system based on beam splitters. We then introduce an end-to-end framework that effectively handles both tasks. Our framework incorporates an ED-TFA module to efficiently leverage temporal information from events and frames. We further propose the SFCM-FE module, which uses cross-modal feature information together with a low-pass filter to suppress noise while enhancing the main structural information. Our proposed method significantly outperforms existing approaches on the joint task. We plan to release both our RELED datasets and code to promote future research.
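The abstract describes the SFCM-FE module only at a high level: cross-modal (event and frame) features drive a low-pass filtering step that suppresses noise while preserving structure. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' released code; the class name `CrossModalLowPassFusion`, the layer sizes, and the use of a simple box filter and event-conditioned gate are all assumptions made for illustration.

```python
# Hypothetical sketch of cross-modal fusion with a low-pass branch, in the
# spirit of the SFCM-FE module described above. Names and layers are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CrossModalLowPassFusion(nn.Module):
    """Fuse frame and event features; suppress noise via a gated low-pass branch."""

    def __init__(self, channels: int, kernel_size: int = 5):
        super().__init__()
        self.kernel_size = kernel_size
        # 1x1 convs project each modality into a shared feature space.
        self.frame_proj = nn.Conv2d(channels, channels, kernel_size=1)
        self.event_proj = nn.Conv2d(channels, channels, kernel_size=1)
        # Event-conditioned gate: decides where to trust the low-pass branch.
        self.gate = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1),
            nn.Sigmoid(),
        )
        self.fuse = nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1)

    def forward(self, frame_feat: torch.Tensor, event_feat: torch.Tensor) -> torch.Tensor:
        f = self.frame_proj(frame_feat)
        e = self.event_proj(event_feat)

        # Low-pass branch: a box filter damps the high-frequency noise that
        # dominates low-light frames.
        low_pass = F.avg_pool2d(
            f, kernel_size=self.kernel_size, stride=1, padding=self.kernel_size // 2
        )

        # Keep sharp frame detail where both modalities indicate real structure,
        # fall back to the low-pass branch elsewhere.
        g = self.gate(torch.cat([f, e], dim=1))
        denoised = g * f + (1.0 - g) * low_pass

        return self.fuse(torch.cat([denoised, e], dim=1))


if __name__ == "__main__":
    block = CrossModalLowPassFusion(channels=32)
    frames = torch.randn(1, 32, 64, 64)  # frame features
    events = torch.randn(1, 32, 64, 64)  # event features (e.g., from a voxel grid)
    print(block(frames, events).shape)   # torch.Size([1, 32, 64, 64])
```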