
Unsupervised Restoration of Weather-affected Images using Deep Gaussian Process-based CycleGAN


Abstract:

Existing approaches for restoring weather-degraded images follow a fully-supervised paradigm and they require paired data for training. However, collecting paired data for weather degradations is extremely challenging, and existing methods end up training on synthetic data. To overcome this issue, we describe an approach for supervising deep networks that is based on CycleGAN, thereby enabling the use of unlabeled real-world data for training. Specifically, we introduce new losses for training CycleGAN that lead to more effective training, resulting in high quality reconstructions. These new losses are obtained by jointly modeling the latent space embeddings of predicted clean images and original clean images through Deep Gaussian Processes. This enables the CycleGAN architecture to transfer the knowledge from one domain (weather-degraded) to another (clean) more effectively. We demonstrate that the proposed method can be effectively applied to different restoration tasks like de-raining, de-hazing and de-snowing and it outperforms other unsupervised techniques (that leverage weather-based characteristics) by a considerable margin.
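The abstract's core idea is to regularize CycleGAN training by modeling the latent embeddings of predicted clean images jointly with those of real clean images through a (Deep) Gaussian Process. The paper's exact loss is not given in this excerpt, so the following is only a minimal single-layer GP sketch of the idea, assuming an RBF kernel, NumPy in place of a deep-learning framework, and hypothetical names (`gp_consistency_loss`, `Z_clean`, `z_pred`): fit a GP to embeddings of real clean images, then penalize a predicted clean embedding by its uncertainty-scaled deviation from the GP posterior mean.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Pairwise RBF kernel between rows of A (n, d) and B (m, d) -> (n, m)
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(Z_clean, z_query, noise=1e-2):
    """GP posterior (mean, variance) at the embedding of a predicted clean
    image, conditioned on embeddings of real clean images (one GP output
    per embedding dimension, shared kernel)."""
    K = rbf_kernel(Z_clean, Z_clean) + noise * np.eye(len(Z_clean))
    k_star = rbf_kernel(Z_clean, z_query[None, :])          # (n, 1)
    K_inv = np.linalg.inv(K)
    mean = (k_star.T @ K_inv @ Z_clean)[0]                  # (d,)
    var = rbf_kernel(z_query[None, :], z_query[None, :]) - k_star.T @ K_inv @ k_star
    return mean, float(var[0, 0])

def gp_consistency_loss(Z_clean, z_pred, noise=1e-2):
    """Penalize predicted embeddings the GP over real clean embeddings finds
    unlikely: squared deviation from the posterior mean, scaled by the
    posterior uncertainty (higher uncertainty -> softer penalty)."""
    mean, var = gp_posterior(Z_clean, z_pred, noise)
    return float(np.sum((z_pred - mean)**2) / (var + noise))
```

Under this sketch, an embedding that resembles the real clean-image embeddings incurs a small loss, while an out-of-distribution embedding is pulled toward the clean manifold; in the paper this signal supplements the usual CycleGAN adversarial and cycle-consistency losses.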
Date of Conference: 21-25 August 2022
Date Added to IEEE Xplore: 29 November 2022
Conference Location: Montreal, QC, Canada

I. Introduction

Weather conditions such as rain, fog (haze) and snow are aberrations in the environment that adversely affect the light rays traveling from the object to a visual sensor [1], [2], [3], [4], [5], [6]. This typically causes detrimental effects on the images captured by the sensors, resulting in poor aesthetic quality. Additionally, such degraded images also reduce the performance of down-stream computer vision tasks such as detection and recognition [7]. These tasks are often critical parts of autonomous navigation systems, which emphasizes the need to address such degradations. These reasons have motivated a plethora of research on methods to remove such effects.

