PG-VTON: A Novel Image-Based Virtual Try-On Method via Progressive Inference Paradigm


Abstract:

Virtual try-on is a promising computer vision topic with high commercial value, in which a new garment is visually worn on a person with a photo-realistic effect. Previous studies conduct shape and content inference in a single stage, employing a single-scale warping mechanism and a relatively unsophisticated content inference mechanism. These approaches yield suboptimal garment warping and skin preservation under challenging try-on scenarios. To address these limitations, we propose a novel virtual try-on method via a progressive inference paradigm (PG-VTON) that leverages a top-down inference pipeline and a general garment try-on strategy. Specifically, we propose a robust try-on parsing inference method by disentangling semantic categories and introducing consistency. Exploiting the try-on parsing as shape guidance, we implement the garment try-on via warping-mapping-composition. To facilitate adaptation to a wide range of try-on scenarios, we adopt a covering-more-and-selecting-one warping strategy and explicitly distinguish tasks based on alignment. Additionally, we adapt StyleGAN2 to perform re-naked skin inpainting, conditioned on the target skin shape and spatial-agnostic skin features. Experiments demonstrate that our method achieves state-of-the-art performance under two challenging scenarios.
Published in: IEEE Transactions on Multimedia (Volume 26)
Page(s): 6595–6608
Date of Publication: 16 January 2024


I. Introduction

Virtual try-on is a promising topic for commercial applications in computer vision. Image-based virtual try-on requires no professional 3D modeling of the person [1], [2] or the garment [3]; it can still present a photo-realistic wearing effect conditioned only on the person and garment images. However, as the variability of persons and garments increases, the content inference and garment warping employed in previous studies have encountered challenges in difficult try-on scenarios. In this paper, we propose a progressive inference paradigm for virtual try-on and employ advanced strategies for garment try-on and skin inpainting to enhance try-on realism, even in the presence of distinct garment categories or complex poses.
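The progressive, top-down pipeline described above (shape inference first, then garment try-on via warping-mapping-composition, then inpainting of newly exposed skin) can be sketched as the following data-flow outline. This is a minimal illustration of the staging only: all function names are hypothetical placeholders, and the real PG-VTON stages are learned networks, not the trivial stubs shown here.

```python
import numpy as np

# --- Hypothetical stage stubs (illustrative placeholders, not the paper's code) ---

def infer_tryon_parsing(person, garment):
    """Stage 1: predict the try-on semantic parsing (per-pixel class map)
    that serves as shape guidance for the later stages."""
    h, w = person.shape[:2]
    return np.zeros((h, w), dtype=np.int64)

def garment_tryon(person, garment, parsing):
    """Stage 2: warp the garment, map its texture, and composite it onto
    the person, constrained by the predicted parsing."""
    return person.copy()

def inpaint_renaked_skin(composite, parsing):
    """Stage 3: fill skin regions newly exposed by the garment change,
    conditioned on the target skin shape (StyleGAN2-based in the paper)."""
    return composite

def pg_vton_pipeline(person, garment):
    # Shape is inferred before content: the parsing drives both the
    # garment composition and the skin inpainting.
    parsing = infer_tryon_parsing(person, garment)
    composite = garment_tryon(person, garment, parsing)
    return inpaint_renaked_skin(composite, parsing)

# Dummy inputs standing in for the person and in-shop garment images.
person = np.random.rand(256, 192, 3)
garment = np.random.rand(256, 192, 3)
result = pg_vton_pipeline(person, garment)
```

The point of the staging is that each step consumes the output of the previous one, so failures in shape inference cannot be silently absorbed by the content stages.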
