
Physically Disentangled Intra- and Inter-domain Adaptation for Varicolored Haze Removal


Abstract:

Learning-based image dehazing methods have achieved marvelous progress during the past few years. On one hand, most approaches heavily rely on synthetic data and may struggle to generalize to real scenes due to the huge domain gap between synthetic and real images. On the other hand, very few works have considered varicolored haze, which is caused by chromatic casts in real scenes. In this work, our goal is to handle a new task: real-world varicolored haze removal. To this end, we propose a physically disentangled joint intra- and inter-domain adaptation paradigm, in which intra-domain adaptation focuses on color correction and the inter-domain procedure transfers knowledge between the synthetic and real domains. We first learn to physically disentangle haze images into three components complying with the scattering model: background, transmission map, and atmospheric light. Since haze color is determined by atmospheric light, we perform intra-domain adaptation by specifically translating the atmospheric light from varicolored space to a unified, color-balanced space, and then reconstructing a color-balanced haze image through the scattering model. Subsequently, we perform inter-domain adaptation between synthetic and real images by mutually exchanging the background and the other two components, so that both identity and domain-translated haze images can be reconstructed under self-consistency and adversarial losses. Extensive experiments demonstrate the superiority of the proposed method over the state of the art for real varicolored image dehazing.
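To make the recombination steps concrete, the sketch below recomposes the three physically disentangled components through the scattering model of Eq. (1). It is only an illustration of the data flow described in the abstract: the learned disentangling and translation networks are not reproduced, and recompose, color_balance_atmospheric_light (a simple channel equalization standing in for the learned atmospheric-light translator), and swap_components are hypothetical helper names, not the paper's implementation.

# Minimal NumPy sketch of the component recombination described in the abstract.
# The disentangling and translation networks are NOT reproduced here; all helper
# functions are illustrative stand-ins built around the scattering model (Eq. 1).
import numpy as np

def recompose(J, t, A):
    """Scattering model: I = J * t + A * (1 - t)."""
    return J * t[..., None] + A[None, None, :] * (1.0 - t[..., None])

def color_balance_atmospheric_light(A):
    """Stand-in for the intra-domain translation of A from varicolored space
    to a color-balanced space (the paper uses a learned translator; here the
    channels are simply equalized as an illustration)."""
    return np.full_like(A, A.mean())

def swap_components(J_syn, t_syn, A_syn, J_real, t_real, A_real):
    """Inter-domain exchange: pair each background with the other domain's
    transmission map and atmospheric light, then recompose hazy images."""
    syn_to_real = recompose(J_syn, t_real, A_real)
    real_to_syn = recompose(J_real, t_syn, A_syn)
    return syn_to_real, real_to_syn

if __name__ == "__main__":
    H, W = 64, 64
    rng = np.random.default_rng(0)
    # Toy components for a "real" varicolored hazy image.
    J_real = rng.uniform(0, 1, (H, W, 3))      # background
    t_real = rng.uniform(0.3, 0.9, (H, W))     # transmission map
    A_real = np.array([0.9, 0.7, 0.5])         # yellowish atmospheric light

    # Intra-domain adaptation: translate A, then rebuild a color-balanced hazy image.
    A_balanced = color_balance_atmospheric_light(A_real)
    I_balanced = recompose(J_real, t_real, A_balanced)
    print("color-balanced hazy image:", I_balanced.shape)

    # Inter-domain adaptation: exchange backgrounds with a toy synthetic sample.
    J_syn = rng.uniform(0, 1, (H, W, 3))
    t_syn = rng.uniform(0.3, 0.9, (H, W))
    A_syn = np.array([0.8, 0.8, 0.8])
    syn2real, real2syn = swap_components(J_syn, t_syn, A_syn, J_real, t_real, A_real)
    print("domain-translated hazy images:", syn2real.shape, real2syn.shape)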
Date of Conference: 18-24 June 2022
Date Added to IEEE Xplore: 27 September 2022
Conference Location: New Orleans, LA, USA



1. Introduction

Haze, a common weather phenomenon, results in low contrast and severe visibility degradation, which not only leads to poor visual quality but also seriously harms high-level vision tasks such as scene classification [35], object detection [25], and semantic segmentation [41]. The hazing process can be mathematically formulated via the well-known atmosphere scattering model [29], [34]: \begin{equation*} I(x)=J(x)t(x)+A(1-t(x)),\tag{1}\end{equation*}

where $I(x)$ is the observed hazy image and $J(x)$ is the haze-free background to be restored. $A$ and $t(x)$ denote the atmospheric light and the transmission map, with $t(x)=e^{-\beta d(x)}$, where $\beta$ and $d(x)$ represent the scattering coefficient and the scene depth, respectively. The goal of dehazing is to estimate $J(x)$ from the hazy input $I(x)$.
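For concreteness, the following minimal NumPy sketch implements Eq. (1), assuming the transmission model $t(x)=e^{-\beta d(x)}$ stated above; all shapes, values, and function names are illustrative rather than taken from the paper.

# Sketch of the atmosphere scattering model in Eq. (1).
import numpy as np

def synthesize_haze(J, depth, beta, A):
    """Forward model: I = J*t + A*(1 - t), with t = exp(-beta * d)."""
    t = np.exp(-beta * depth)[..., None]            # (H, W, 1)
    return J * t + A[None, None, :] * (1.0 - t), t.squeeze(-1)

def dehaze(I, t, A, t_min=0.1):
    """Invert Eq. (1): J = (I - A*(1 - t)) / max(t, t_min)."""
    t = np.clip(t, t_min, 1.0)[..., None]
    return (I - A[None, None, :] * (1.0 - t)) / t

if __name__ == "__main__":
    H, W = 32, 32
    rng = np.random.default_rng(1)
    J = rng.uniform(0, 1, (H, W, 3))                # clean background
    depth = rng.uniform(1.0, 10.0, (H, W))          # scene depth (arbitrary units)
    A = np.array([0.8, 0.8, 0.8])                   # neutral atmospheric light
    I, t = synthesize_haze(J, depth, beta=0.1, A=A)
    J_hat = dehaze(I, t, A)
    print("max reconstruction error:", np.abs(J - J_hat).max())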

Figure: Visual examples of dehazing results on real-world varicolored haze images. The second and third columns show the results of DA-dehazing [42] and the proposed method, respectively.

