CT-Net: Complementary Transfering Network for Garment Transfer with Arbitrary Geometric Changes


Abstract

Garment transfer shows great potential in realistic applications, with the goal of transferring outfits across images of different people. However, garment transfer between images with heavy misalignment or severe occlusion remains a challenge. In this work, we propose the Complementary Transfering Network (CT-Net) to adaptively model different levels of geometric change and transfer outfits between different people. Specifically, CT-Net consists of three modules: 1) A complementary warping module first estimates two complementary warpings to transfer the desired clothes at different granularities. 2) A layout prediction module predicts the target layout, which guides the preservation or generation of body parts in the synthesized images. 3) A dynamic fusion module adaptively combines the advantages of the complementary warpings to render the garment transfer results. Extensive experiments conducted on the DeepFashion dataset demonstrate that our network synthesizes high-quality garment transfer images and significantly outperforms state-of-the-art methods both qualitatively and quantitatively.
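The dynamic fusion step described above can be pictured as a per-pixel soft blend between the two complementary warpings, weighted by a predicted mask. The following is a minimal NumPy sketch of that idea; the function and argument names (`dynamic_fusion`, `warp_a`, `warp_b`, `mask_logits`) are hypothetical and are not taken from the paper's implementation.

```python
import numpy as np

def dynamic_fusion(warp_a: np.ndarray, warp_b: np.ndarray,
                   mask_logits: np.ndarray) -> np.ndarray:
    """Blend two complementary warped garment images with a soft mask.

    warp_a, warp_b : arrays of shape (H, W, C), the two warping results.
    mask_logits    : array of shape (H, W, 1), unnormalized mask scores
                     (e.g. produced by a small prediction network).
    """
    # Sigmoid squashes the logits into (0, 1), giving a per-pixel weight.
    mask = 1.0 / (1.0 + np.exp(-mask_logits))
    # Convex combination: where mask is near 1, warp_a dominates;
    # where mask is near 0, warp_b dominates.
    return mask * warp_a + (1.0 - mask) * warp_b
```

Since the blend is a convex combination at every pixel, the fused output always stays within the range spanned by the two inputs, which avoids introducing artifacts outside either warping.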

Download