Deformable Style Transfer

@article{Kim2020DeformableST,
  title={Deformable Style Transfer},
  author={Sunnie Kim and Nicholas I. Kolkin and Jason Salavon and Gregory Shakhnarovich},
  journal={ArXiv},
  year={2020},
  volume={abs/2003.11038}
}
Both geometry and texture are fundamental aspects of visual style. Existing style transfer methods, however, primarily focus on texture, almost entirely ignoring geometry. We propose deformable style transfer (DST), an optimization-based approach that jointly stylizes the texture and geometry of a content image to better match a style image. Unlike previous geometry-aware stylization methods, our approach is neither restricted to a particular domain (such as human faces), nor does it require… 
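To make the optimization concrete, here is a minimal sketch of joint texture-and-geometry stylization in PyTorch. It illustrates the general setup only, not the authors' implementation: DST derives its deformation from matched keypoints, whereas this sketch optimizes a free-form offset grid, and style_loss / content_loss are hypothetical callables assumed given.

    import torch
    import torch.nn.functional as F

    def stylize(content, style_loss, content_loss, num_steps=500):
        # content: (1, 3, H, W) image tensor. Pixels and warp offsets are
        # optimized jointly, so texture and geometry adapt together.
        output = content.clone().requires_grad_(True)
        theta = torch.eye(2, 3).unsqueeze(0)
        identity = F.affine_grid(theta, list(content.shape), align_corners=False)
        offsets = torch.zeros_like(identity, requires_grad=True)  # (1, H, W, 2)

        opt = torch.optim.Adam([output, offsets], lr=2e-3)
        for _ in range(num_steps):
            opt.zero_grad()
            warped = F.grid_sample(output, identity + offsets, align_corners=False)
            loss = (style_loss(warped)
                    + content_loss(warped, content)
                    + offsets.pow(2).mean())  # regularizer keeps the warp gentle
            loss.backward()
            opt.step()
        return F.grid_sample(output, identity + offsets, align_corners=False).detach()

The essential point is that gradients reach the offsets through the differentiable resampling, so a single optimizer updates both texture and geometry.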
Citations

Rethinking Style Transfer: From Pixels to Parameterized Brushstrokes
TLDR
This work proposes a method to stylize images by optimizing parameterized brushstrokes instead of pixels and further introduces a simple differentiable rendering mechanism.
Learning to Transfer Visual Effects from Videos to Images
TLDR
This work evaluates the problem of animating images by transferring spatio-temporal visual effects from a collection of videos, and shows interesting qualitative results which demonstrate objects undergoing atypical transformations, such as making a face melt or a deer bloom.
The Spatially-Correlative Loss for Various Image Translation Tasks
TLDR
This work proposes a novel spatially-correlative loss that is simple, efficient, and yet effective for preserving scene structure consistency while supporting large appearance changes during unpaired image-to-image (I2I) translation, and introduces a new self-supervised learning method to explicitly learn spatially-correlative maps for each specific translation task.
Facial Attribute Transformers for Precise and Robust Makeup Transfer
TLDR
Extensive qualitative and quantitative experiments demonstrate the effectiveness and superiority of the proposed FATs in the following aspects: ensuring high-fidelity color transfer; allowing for geometric transformation of facial parts; handling facial variations; and supporting high-resolution face generation.
DeepFaceEditing
TLDR
DeepFaceEditing is a structured disentanglement framework specifically designed for face images to support face generation and editing with disentangled control of geometry and appearance, and adopts a local-to-global approach to incorporate the face domain knowledge.
JoJoGAN: One Shot Face Stylization
TLDR
Qualitative and quantitative evaluation show that JoJoGAN produces high quality high resolution images that vastly outperform the current state-of-the-art.
Learning to Warp for Style Transfer
TLDR
A neural network is proposed that learns a mapping from a 4D array of inter-feature distances to a non-parametric 2D warp field, extending the normal NST paradigm: although it can be used with a single exemplar, it also allows two style exemplars, one for texture and another for geometry.
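Applying such a non-parametric 2D warp field is standard differentiable resampling; the paper's contribution, the network that predicts the field from the 4D distance array, is not reproduced here. A minimal sketch, assuming offsets in normalized [-1, 1] grid coordinates:

    import torch
    import torch.nn.functional as F

    def apply_warp(image, flow):
        # image: (N, C, H, W); flow: (N, H, W, 2) predicted warp offsets.
        n = image.shape[0]
        theta = torch.eye(2, 3).unsqueeze(0).repeat(n, 1, 1)
        base = F.affine_grid(theta, list(image.shape), align_corners=False)
        # Bilinear resampling keeps the warp differentiable end to end.
        return F.grid_sample(image, base + flow, align_corners=False)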
Industrial Style Transfer with Large-scale Geometric Warping and Content Preservation
TLDR
The model, Industrial Style Transfer (InST), consists of large-scale geometric warping (LGW) and interest-consistency texture transfer (ICTT), and introduces a mask smoothness regularization term to prevent abrupt changes in the details of the source product.
Learning to Warp for Style Transfer: Supplementary Material
TLDR
This supplementary material covers user control, including ways to overcome ostensible limitations; a description of the evaluation interface for the similarity experiments, together with raw results; accuracy and robustness tests; and the architecture of the warp network.

References

Showing 1-10 of 32 references
Image quilting for texture synthesis and transfer
TLDR
This work uses quilting as a fast and very simple texture synthesis algorithm which produces surprisingly good results for a wide range of textures and extends the algorithm to perform texture transfer — rendering an object with a texture taken from a different object.
Style Transfer by Relaxed Optimal Transport and Self-Similarity
TLDR
The results indicate that for any desired level of content preservation, the proposed Style Transfer by Relaxed Optimal Transport and Self-Similarity (STROTSS), a new optimization-based style transfer algorithm, provides higher quality stylization than prior work.
Arbitrary Style Transfer with Deep Feature Reshuffle
TLDR
A novel method for arbitrary style transfer is introduced that reshuffles deep features of the style image, connecting the global and local style losses used by most parametric and non-parametric neural style transfer methods, respectively.
A Style-Aware Content Loss for Real-time HD Style Transfer
TLDR
A style-aware content loss is proposed, which is trained jointly with a deep encoder-decoder network for real-time, high-resolution stylization of images and videos and results show that this approach better captures the subtle nature in which a style affects content.
Image Style Transfer Using Convolutional Neural Networks
TLDR
A Neural Algorithm of Artistic Style is introduced that can separate and recombine the image content and style of natural images and provide new insights into the deep image representations learned by Convolutional Neural Networks and demonstrate their potential for high level image synthesis and manipulation.
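The style representation used here, the Gram matrix of CNN feature activations, is well documented and simple to state. A minimal sketch of the style loss, assuming the VGG feature maps for the chosen layers are given (layer choice and weights are the paper's hyperparameters):

    import torch

    def gram_matrix(feat):
        # feat: (N, C, H, W) activations from one CNN layer.
        n, c, h, w = feat.shape
        f = feat.reshape(n, c, h * w)
        # Channel-by-channel correlations, normalized by layer size.
        return f @ f.transpose(1, 2) / (c * h * w)

    def style_loss(feats_output, feats_style):
        # Sum of squared Gram differences over the chosen layers.
        return sum((gram_matrix(a) - gram_matrix(b)).pow(2).sum()
                   for a, b in zip(feats_output, feats_style))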
Semantic Style Transfer and Turning Two-Bit Doodles into Fine Artworks
TLDR
This paper introduces a novel concept to augment such generative architectures with semantic annotations, either by manually authoring pixel labels or using existing solutions for semantic segmentation, resulting in a content-aware generative algorithm that offers meaningful control over the outcome.
Quantitative Evaluation of Style Transfer
TLDR
It is likely that, for current methods, each style requires a different choice of weights to obtain the best results, so automated weight-setting methods are desirable.
Arbitrary Style Transfer in Real-Time with Adaptive Instance Normalization
TLDR
This paper presents a simple yet effective approach that for the first time enables arbitrary style transfer in real-time, comparable to the fastest existing approach, without the restriction to a pre-defined set of styles.
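The titular operation itself can be stated exactly: re-normalize each channel of the content features to carry the channel-wise mean and standard deviation of the style features. A minimal sketch (the encoder-decoder pipeline around it is the rest of the method):

    import torch

    def adain(content_feat, style_feat, eps=1e-5):
        # content_feat, style_feat: (N, C, H, W).
        c_mean = content_feat.mean(dim=(2, 3), keepdim=True)
        c_std = content_feat.std(dim=(2, 3), keepdim=True) + eps
        s_mean = style_feat.mean(dim=(2, 3), keepdim=True)
        s_std = style_feat.std(dim=(2, 3), keepdim=True) + eps
        # Shift and scale content statistics onto the style statistics.
        return s_std * (content_feat - c_mean) / c_std + s_mean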
Controlling Perceptual Factors in Neural Style Transfer
TLDR
The existing Neural Style Transfer method is extended to introduce control over spatial location, colour information and across spatial scale, enabling the combination of style information from multiple sources to generate new, perceptually appealing styles from existing ones.
Incorporating long-range consistency in CNN-based texture generation
TLDR
A simple modification to that representation of pair-wise products of features in a convolutional network is proposed which makes it possible to incorporate long-range structure into image generation, and to render images that satisfy various symmetry constraints.
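The modification is, roughly, to correlate a feature map with a spatially shifted copy of itself, so the resulting statistics encode structure at a chosen offset rather than only channel co-occurrence. A hedged sketch of such a shifted Gram matrix (the paper's exact formulation and symmetry terms may differ):

    import torch

    def shifted_gram(feat, dx=0, dy=0):
        # feat: (N, C, H, W); (dx, dy) = (0, 0) recovers the plain Gram matrix.
        n, c, h, w = feat.shape
        a = feat[:, :, max(dy, 0):h + min(dy, 0), max(dx, 0):w + min(dx, 0)]
        b = feat[:, :, max(-dy, 0):h + min(-dy, 0), max(-dx, 0):w + min(-dx, 0)]
        a = a.reshape(n, c, -1)
        b = b.reshape(n, c, -1)
        return a @ b.transpose(1, 2) / a.shape[-1]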