1. Introduction
Recent years have witnessed growing demand for online shopping of fashion items. Online apparel and accessories sales in the US are expected to reach $123 billion in 2022, up from $72 billion in 2016 [1]. Despite the convenience online fashion shopping provides, consumers buying apparel online remain concerned about how a particular fashion item shown in a product image would look on them. Allowing consumers to virtually try on clothes would therefore not only enhance their shopping experience, transforming the way people shop for clothes, but also reduce costs for retailers. Motivated by this, companies such as TriMirror and Fits Me have developed various virtual fitting rooms and mirrors. However, the key enabling factor behind these systems is the use of 3D measurements of body shape, either captured directly by depth cameras [40] or inferred from a 2D image using training data [4], [45]. While such 3D modeling techniques enable realistic clothing simulation on a person, the high costs of installing hardware and collecting 3D-annotated data inhibit their large-scale deployment.
[Figure caption] Virtual try-on results generated by our method. Each row shows a person virtually trying on different clothing items. Our model naturally renders the items onto a person while retaining her pose and preserving detailed characteristics of the target clothing items.