GC-VTON: Predicting Globally Consistent and Occlusion Aware Local Flows with Neighborhood Integrity Preservation for Virtual Try-on


Abstract:

Flow-based garment warping is an integral part of image-based virtual try-on networks. However, optimizing a single flow-predicting network for simultaneous global boundary alignment and local texture preservation results in sub-optimal flow fields. Moreover, dense flows are inherently not suited to handle intricate conditions like garment occlusion by body parts or by other garments. Forcing flows to handle the above issues results in various distortions like texture squeezing and stretching. In this work, we propose a novel approach where we disentangle the global boundary alignment and local texture-preserving tasks via our GlobalNet and LocalNet modules. A consistency loss is then employed between the two modules, which harmonizes the local flows with the global boundary alignment. Additionally, we explicitly handle occlusions by predicting a body-part visibility mask, which is used to mask out the occluded regions in the warped garment. The masking prevents the LocalNet from predicting flows that distort texture to compensate for occlusions. We also introduce a novel regularization loss (NIPR) that defines a criterion to identify the regions in the warped garment where texture integrity is violated (squeezed or stretched). NIPR subsequently penalizes the flow in those regions to ensure regular and coherent warps that preserve the texture in local neighborhoods. Evaluation on a widely used virtual try-on dataset demonstrates strong performance of our network compared to current SOTA methods.
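
To make the masking and consistency ideas above concrete, the following is a minimal PyTorch-style sketch, not the paper's implementation: the module names GlobalNet/LocalNet come from the abstract, while the tensor shapes, the flow-to-grid conversion, and the exact loss forms (L1 losses, detached global flow) are assumptions made purely for illustration.

import torch
import torch.nn.functional as F

def flow_to_grid(flow):
    # Convert a dense flow of pixel offsets (B, 2, H, W) into a normalized
    # sampling grid (B, H, W, 2) for F.grid_sample.
    _, _, h, w = flow.shape
    ys, xs = torch.meshgrid(
        torch.arange(h, device=flow.device, dtype=flow.dtype),
        torch.arange(w, device=flow.device, dtype=flow.dtype),
        indexing="ij",
    )
    base = torch.stack((xs, ys), dim=0).unsqueeze(0)   # (1, 2, H, W)
    coords = base + flow                               # absolute source coordinates
    gx = 2.0 * coords[:, 0] / max(w - 1, 1) - 1.0      # normalize x to [-1, 1]
    gy = 2.0 * coords[:, 1] / max(h - 1, 1) - 1.0      # normalize y to [-1, 1]
    return torch.stack((gx, gy), dim=-1)

def warp(garment, flow):
    # Warp the in-shop garment image (B, 3, H, W) with a dense flow field.
    return F.grid_sample(garment, flow_to_grid(flow), align_corners=True)

def try_on_losses(garment, global_flow, local_flow, visibility_mask, target):
    # (1) Consistency loss: harmonize LocalNet's flow with GlobalNet's
    #     boundary-aligning flow (here a simple L1, global flow detached).
    consistency = F.l1_loss(local_flow, global_flow.detach())
    # (2) Reconstruction on the occlusion-masked warped garment: occluded
    #     pixels are masked out so the local flow is not forced to distort
    #     texture to compensate for occlusions.
    warped = warp(garment, local_flow) * visibility_mask
    recon = F.l1_loss(warped, target * visibility_mask)
    return consistency, recon

Masking before the reconstruction term is what removes the incentive for the local flow to squeeze or stretch texture into occluded regions, which is the behavior the abstract identifies as the source of distortions.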
Date of Conference: 03-08 January 2024
Date Added to IEEE Xplore: 09 April 2024
Conference Location: Waikoloa, HI, USA

1. Introduction

Image-based virtual try-on aims to generate natural, distortion- and artifact-free images of a person wearing a selected garment. Image synthesis via GANs [7] has been widely used in applications like image editing [16], [23], [26], style transfer [2], [8], [33] and image generation [13], [30], [34]. However, simply using synthesis methods that holistically change the image does not result in the desired quality in the virtual try-on setting. Existing methods adopt a scheme where the garment is first warped to match the pose of the target person. A GAN-based generator network then fuses the warped garment and the target person images to generate a final try-on image. Traditionally, the warping is done either by a Thin Plate Spline (TPS) warp [3], [5], [10], [12], [15], [25], [29], by a warp based on dense flow fields [1], [4], [9], [11], [14], or by a combination of both [28]. In any case, the warping is inherently not capable of modeling all the changes that a garment undergoes (e.g., occlusions) when it fits on a target person. Forcing it to do so results in artifacts such as texture squeezing, stretching, and garment tearing.
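
As a rough illustration of this two-stage scheme (warp, then fuse), the snippet below sketches the pipeline with hypothetical module interfaces; it is not the architecture of GC-VTON or of any cited method, and the assumption that the warping network directly outputs a normalized sampling grid is made only to keep the example short.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoStageTryOn(nn.Module):
    # warp_net and generator are hypothetical placeholder modules.
    def __init__(self, warp_net: nn.Module, generator: nn.Module):
        super().__init__()
        self.warp_net = warp_net    # assumed to predict a normalized sampling grid (B, 2, H, W)
        self.generator = generator  # GAN-based fusion network

    def forward(self, person, garment, pose):
        # Stage 1: predict a sampling grid conditioned on the person, pose, and
        # garment, then warp the in-shop garment toward the target pose.
        grid = self.warp_net(torch.cat([person, pose, garment], dim=1))
        warped_garment = F.grid_sample(
            garment, grid.permute(0, 2, 3, 1), align_corners=True
        )
        # Stage 2: fuse the warped garment with the person representation
        # to synthesize the final try-on image.
        return self.generator(torch.cat([person, warped_garment], dim=1))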
