Corpus ID: 233169050

Facial Attribute Transformers for Precise and Robust Makeup Transfer

@article{Wan2021FacialAT,
  title={Facial Attribute Transformers for Precise and Robust Makeup Transfer},
  author={Zhaoyi Wan and Haoran Chen and Jielei Zhang and Wentao Jiang and Cong Yao and Jiebo Luo},
  journal={ArXiv},
  year={2021},
  volume={abs/2104.02894}
}
In this paper, we address the problem of makeup transfer, which aims at transplanting the makeup from the reference face to the source face while preserving the identity of the source. Existing makeup transfer methods have made notable progress in generating realistic makeup faces, but do not perform well in terms of color fidelity and spatial transformation. To tackle these issues, we propose a novel Facial Attribute Transformer (FAT) and its variant Spatial FAT for high-quality makeup…
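The abstract is truncated above, so the precise FAT architecture is not restated here. As a hedged illustration only of the general idea of a transformer attending from source-face features to reference-face features to gather makeup information, the Python sketch below uses a single cross-attention layer; the module name MakeupCrossAttention, the embedding size, and the residual fusion are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class MakeupCrossAttention(nn.Module):
    """Hypothetical sketch: source-face features attend to reference-face
    features so makeup information is gathered from semantically
    corresponding regions. Not the authors' FAT implementation."""
    def __init__(self, dim=64):
        super().__init__()
        self.to_q = nn.Conv2d(dim, dim, 1)  # queries from the source face
        self.to_k = nn.Conv2d(dim, dim, 1)  # keys from the reference face
        self.to_v = nn.Conv2d(dim, dim, 1)  # values carry makeup information
        self.scale = dim ** -0.5

    def forward(self, src_feat, ref_feat):
        b, c, h, w = src_feat.shape
        q = self.to_q(src_feat).flatten(2).transpose(1, 2)   # (b, hw, c)
        k = self.to_k(ref_feat).flatten(2)                    # (b, c, hw)
        v = self.to_v(ref_feat).flatten(2).transpose(1, 2)    # (b, hw, c)
        attn = torch.softmax(q @ k * self.scale, dim=-1)      # (b, hw, hw)
        out = (attn @ v).transpose(1, 2).reshape(b, c, h, w)
        return src_feat + out  # fuse transferred makeup with source content

# Example with random features:
# fused = MakeupCrossAttention(64)(torch.randn(1, 64, 32, 32), torch.randn(1, 64, 32, 32))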
Citations

Analogous to Evolutionary Algorithm: Designing a Unified Sequence Model
TLDR
This work improves the existing transformer structure and proposes a more efficient EAT model, introduces the space-filling curve into the current vision transformer to sequence image data into a uniform sequential format, and designs task-related heads to deal with different tasks more flexibly.
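The TLDR above mentions sequencing image data with a space-filling curve. As a hedged illustration (not the EAT authors' code), the sketch below orders the patches of a feature map along a Hilbert curve using the standard index-to-coordinate recurrence; the grid side n is assumed to be a power of two, so non-square or odd-sized patch grids would need padding first.

def hilbert_d2xy(n, d):
    """Standard Hilbert-curve mapping from 1-D index d to (x, y) on an
    n x n grid (n must be a power of two)."""
    x = y = 0
    t = d
    s = 1
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:                       # rotate/flip the quadrant
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

def serialize_patches(patches, n):
    """Reorder an n*n list of patches (row-major order) along the
    Hilbert curve, producing one uniform sequence."""
    order = [hilbert_d2xy(n, d) for d in range(n * n)]
    return [patches[y * n + x] for x, y in order]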

References

SHOWING 1-10 OF 36 REFERENCES
PSGAN: Pose and Expression Robust Spatial-Aware GAN for Customizable Makeup Transfer
TLDR
Pose and expression robust Spatial-Aware GAN (PSGAN) is proposed, which not only achieves state-of-the-art results even when large pose and expression differences exist but is also able to perform partial and shade-controllable makeup transfer.
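The "partial and shade-controllable" transfer mentioned above can be read as blending the transferred result with the source under a region mask and an intensity weight. The sketch below is a hedged illustration of that blending step only, not PSGAN's attentive makeup morphing; the function and argument names are assumptions.

import numpy as np

def blend_partial_makeup(source, transferred, region_mask, shade=1.0):
    """Hedged illustration of partial, shade-controllable blending:
    region_mask selects which facial region receives makeup (e.g. lips),
    shade in [0, 1] scales how strongly the makeup is applied."""
    alpha = np.clip(shade, 0.0, 1.0) * region_mask  # per-pixel blend weight
    return alpha * transferred + (1.0 - alpha) * source

# Example: apply 60% lip makeup only
# result = blend_partial_makeup(src_img, makeup_img, lip_mask[..., None], shade=0.6)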
BeautyGAN: Instance-level Facial Makeup Transfer with Deep Generative Adversarial Network
TLDR
A dual-input/output Generative Adversarial Network is proposed that learns instance-level translation through unsupervised adversarial learning and generates visually pleasant makeup faces with accurate transfer results.
Face Beautification: Beyond Makeup Transfer
TLDR
This work proposes a novel face beautification framework that combines style-based beauty representation and beauty score prediction in the beautification process, and targets many-to-many translation where multiple outputs can be defined by either different references or varying beauty scores.
BeautyGlow: On-Demand Makeup Transfer Framework With Reversible Generative Network
TLDR
Experimental results show that the transfer quality of BeautyGlow is comparable to state-of-the-art methods, while the unique ability to manipulate latent vectors allows BeautyGlow to realize on-demand makeup transfer.
Faster than Real-Time Facial Alignment: A 3D Spatial Transformer Network Approach in Unconstrained Poses
TLDR
This work presents a novel approach to simultaneously extract the 3D shape of the face and the semantically consistent 2D alignment through a 3D Spatial Transformer Network (3DSTN) that models both the camera projection matrix and the warping parameters of a 3D model.
LADN: Local Adversarial Disentangling Network for Facial Makeup and De-Makeup
TLDR
A local adversarial disentangling network for facial makeup and de-makeup is proposed that can distinguish whether generated local image details are consistent with the corresponding regions of the reference image, enabling cross-image style transfer in an unsupervised setting.
Stable and Controllable Neural Texture Synthesis and Style Transfer Using Histogram Losses
TLDR
This paper first gives a mathematical explanation of the source of instabilities in many previous approaches, and then mitigates these instabilities by using histogram losses to synthesize textures that better statistically match the exemplar.
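A histogram loss compares the value distributions of feature activations rather than their spatial arrangement. As a hedged stand-in (not the paper's exact histogram-matching procedure), the sketch below matches the sorted per-channel activations of two feature maps, which constrains their 1-D marginal distributions in a similar spirit.

import torch

def sorted_marginal_loss(feat_out, feat_target):
    """Hedged proxy for a histogram loss: penalize differences between the
    sorted values of each channel's activations. Both tensors are assumed
    to have shape (batch, channels, height, width) with matching sizes."""
    b, c = feat_out.shape[:2]
    out_sorted, _ = feat_out.reshape(b, c, -1).sort(dim=-1)
    tgt_sorted, _ = feat_target.reshape(b, c, -1).sort(dim=-1)
    return torch.mean((out_sorted - tgt_sorted) ** 2)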
LinesToFacePhoto: Face Photo Generation From Lines With Conditional Self-Attention Generative Adversarial Networks
TLDR
This paper introduces a conditional self-attention mechanism to cGANs to capture long-range dependencies between different regions in faces, and builds a multi-scale discriminator that enforces the completeness of global structures and encourages fine details, thereby enhancing the realism of generated face images.
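The multi-scale discriminator mentioned above judges the same image at several resolutions, so coarse discriminators enforce global structure while fine ones encourage detail. The sketch below is a hedged illustration of that objective; the hinge loss and average pooling are illustrative choices, not the paper's exact setup.

import torch
import torch.nn.functional as F

def multiscale_d_loss(discriminators, real, fake):
    """Hedged sketch of a multi-scale discriminator objective: each
    discriminator i sees the images downsampled by a factor of 2**i."""
    loss = 0.0
    for i, d in enumerate(discriminators):
        scale = 2 ** i
        r = F.avg_pool2d(real, scale) if scale > 1 else real
        f = F.avg_pool2d(fake, scale) if scale > 1 else fake
        loss = loss + torch.relu(1.0 - d(r)).mean() \
                    + torch.relu(1.0 + d(f)).mean()
    return loss / len(discriminators)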
A Closed-Form Solution to Universal Style Transfer
TLDR
A novel interpretation of feature transform is proposed by treating it as an optimal transport problem, and a closed-form solution named Optimal Style Transfer (OST) is derived under this formulation by additionally considering the content loss of Gatys et al.
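The closed-form object underlying this line of work is the optimal-transport (Monge) map between two Gaussians fitted to content and style features, T = C_c^{-1/2} (C_c^{1/2} C_s C_c^{1/2})^{1/2} C_c^{-1/2}. The sketch below computes that map with NumPy/SciPy; it is a hedged illustration of the transform family, not the exact OST solution with the content-loss term.

import numpy as np
from scipy.linalg import sqrtm

def gaussian_ot_transfer(content_feat, style_feat, eps=1e-5):
    """Closed-form optimal-transport map between Gaussian feature
    statistics. Features are (channels, pixels); each set is modelled
    by its mean and covariance."""
    mu_c = content_feat.mean(axis=1, keepdims=True)
    mu_s = style_feat.mean(axis=1, keepdims=True)
    xc = content_feat - mu_c
    xs = style_feat - mu_s
    cov_c = xc @ xc.T / xc.shape[1] + eps * np.eye(xc.shape[0])
    cov_s = xs @ xs.T / xs.shape[1] + eps * np.eye(xs.shape[0])
    c_half = sqrtm(cov_c)                      # C_c^{1/2}
    c_half_inv = np.linalg.inv(c_half)         # C_c^{-1/2}
    mid = sqrtm(c_half @ cov_s @ c_half)       # (C_c^{1/2} C_s C_c^{1/2})^{1/2}
    T = c_half_inv @ mid @ c_half_inv
    return np.real(T @ xc + mu_s)              # transported content features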
PairedCycleGAN: Asymmetric Style Transfer for Applying and Removing Makeup
TLDR
This paper introduces an automatic method for editing a portrait photo so that the subject appears to be wearing makeup in the style of another person in a reference photo using a new framework of cycle-consistent generative adversarial networks.
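The asymmetric cycle idea is that applying makeup from a reference and then removing it should recover the bare source face, and removing then re-applying the reference's makeup should recover the reference. The sketch below is a hedged illustration of that consistency term only; apply_fn and remove_fn are hypothetical stand-ins for the two generators, not the paper's networks.

import torch

def makeup_cycle_loss(apply_fn, remove_fn, source, reference):
    """Hedged sketch of an asymmetric cycle-consistency loss:
    apply_fn(src, ref) transfers the reference's makeup onto src,
    remove_fn(img) strips makeup from img."""
    made_up = apply_fn(source, reference)
    recovered_source = remove_fn(made_up)
    bare_reference = remove_fn(reference)
    recovered_reference = apply_fn(bare_reference, reference)
    return (torch.abs(recovered_source - source).mean()
            + torch.abs(recovered_reference - reference).mean())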