Deformable Style Transfer
@article{Kim2020DeformableST,
  title   = {Deformable Style Transfer},
  author  = {Sunnie Kim and Nicholas I. Kolkin and Jason Salavon and Gregory Shakhnarovich},
  journal = {ArXiv},
  year    = {2020},
  volume  = {abs/2003.11038}
}
Both geometry and texture are fundamental aspects of visual style. Existing style transfer methods, however, primarily focus on texture, almost entirely ignoring geometry. We propose deformable style transfer (DST), an optimization-based approach that jointly stylizes the texture and geometry of a content image to better match a style image. Unlike previous geometry-aware stylization methods, our approach is neither restricted to a particular domain (such as human faces), nor does it require…
15 Citations
Rethinking Style Transfer: From Pixels to Parameterized Brushstrokes
- Computer Science, 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
- 2021
This work proposes a method to stylize images by optimizing parameterized brushstrokes instead of pixels and further introduces a simple differentiable rendering mechanism.
Learning to Transfer Visual Effects from Videos to Images
- Art, ArXiv
- 2020
This work evaluates the problem of animating images by transferring spatio-temporal visual effects from a collection of videos, and shows interesting qualitative results which demonstrate objects undergoing atypical transformations, such as making a face melt or a deer bloom.
The Spatially-Correlative Loss for Various Image Translation Tasks
- Computer Science, 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
- 2021
This work proposes a novel spatially-correlative loss that is simple, efficient and yet effective for preserving scene structure consistency while supporting large appearance changes during unpaired image-to-image (I2I) translation, and introduces a new self-supervised learning method to explicitly learn spatially-correlative maps for each specific translation task.
Facial Attribute Transformers for Precise and Robust Makeup Transfer
- Computer Science, 2022 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)
- 2022
Extensive qualitative and quantitative experiments demonstrate the effectiveness and superiority of the proposed FATs in the following aspects: ensuring high-fidelity color transfer; allowing for geometric transformation of facial parts; handling facial variations; and supporting high-resolution face generation.
DeepFaceEditing
- Computer Science, ACM Transactions on Graphics
- 2021
DeepFaceEditing is a structured disentanglement framework specifically designed for face images to support face generation and editing with disentangled control of geometry and appearance, and adopts a local-to-global approach to incorporate the face domain knowledge.
JoJoGAN: One Shot Face Stylization
- Computer Science, ArXiv
- 2021
Qualitative and quantitative evaluation show that JoJoGAN produces high quality high resolution images that vastly outperform the current state-of-the-art.
Learning to Warp for Style Transfer
- Computer Science, 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
- 2021
A neural network is proposed that learns a mapping from a 4D array of inter-feature distances to a non-parametric 2D warp field and extends the normal NST paradigm: although it can be used with a single exemplar, it also allows two style exemplars, one for texture and another for geometry.
Industrial Style Transfer with Large-scale Geometric Warping and Content Preservation
- Computer Science, ArXiv
- 2022
The model, Industrial Style Transfer (InST), consists of large-scale geometric warping (LGW) and interest-consistency texture transfer (ICTT), and introduces a mask smoothness regularization term to prevent the abrupt changes of the details of the source product.
Learning to Warp for Style Transfer: Supplementary Material
- Computer Science
- 2021
This supplementary material comprises user control, including ways to overcome ostensible limitations, a description of the evaluation interface for similarity experiments, and raw results, accuracy and robustness tests, and the architecture of the warp network.
References
Showing 1-10 of 32 references
Image quilting for texture synthesis and transfer
- Computer Science, SIGGRAPH
- 2001
This work uses quilting as a fast and very simple texture synthesis algorithm which produces surprisingly good results for a wide range of textures and extends the algorithm to perform texture transfer — rendering an object with a texture taken from a different object.
Style Transfer by Relaxed Optimal Transport and Self-Similarity
- Computer Science, 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
- 2019
The results indicate that for any desired level of content preservation, the proposed Style Transfer by Relaxed Optimal Transport and Self-Similarity (STROTSS), a new optimization-based style transfer algorithm, provides higher quality stylization than prior work.
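The STROTSS summary above mentions a self-similarity term; not from the paper's code, but as a rough illustration of that idea, here is a minimal NumPy sketch (the names `self_similarity` and `content_loss` are hypothetical): content structure is represented by the pairwise cosine-similarity matrix of feature vectors, which is invariant to appearance changes that affect all locations consistently.

```python
import numpy as np

def self_similarity(feats):
    """Pairwise cosine-similarity matrix of a set of feature vectors.

    feats: (n, d) array, one d-dimensional feature vector per spatial
    location. Returns an (n, n) matrix.
    """
    norms = np.linalg.norm(feats, axis=1, keepdims=True) + 1e-8
    unit = feats / norms
    return unit @ unit.T

def content_loss(feats_a, feats_b):
    """Mean absolute difference between the two self-similarity matrices,
    comparing structure while ignoring absolute feature magnitudes."""
    return np.abs(self_similarity(feats_a) - self_similarity(feats_b)).mean()
```

Comparing self-similarity matrices, rather than features directly, is what lets stylization change appearance freely while the relative arrangement of content is preserved.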
Arbitrary Style Transfer with Deep Feature Reshuffle
- Computer Science, 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition
- 2018
A novel method by reshuffling deep features of the style image for arbitrary style transfer that connects both global and local style losses respectively used by most parametric and non-parametric neural style transfer methods is introduced.
A Style-Aware Content Loss for Real-time HD Style Transfer
- Computer Science, ECCV
- 2018
A style-aware content loss is proposed, which is trained jointly with a deep encoder-decoder network for real-time, high-resolution stylization of images and videos and results show that this approach better captures the subtle nature in which a style affects content.
Image Style Transfer Using Convolutional Neural Networks
- Computer Science, Art, 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
- 2016
A Neural Algorithm of Artistic Style is introduced that can separate and recombine the image content and style of natural images and provide new insights into the deep image representations learned by Convolutional Neural Networks and demonstrate their potential for high level image synthesis and manipulation.
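The style representation in the Gatys et al. method is the Gram matrix of CNN feature maps. A minimal NumPy sketch of that standard construction (function names are illustrative, not from the paper's code):

```python
import numpy as np

def gram_matrix(feats):
    """Gram matrix of CNN activations: correlations between feature
    channels, with spatial layout averaged away.

    feats: (c, h, w) activation tensor from one convolutional layer.
    Returns a (c, c) matrix normalized by the number of spatial positions.
    """
    c, h, w = feats.shape
    f = feats.reshape(c, h * w)
    return f @ f.T / (h * w)

def style_loss(feats_gen, feats_style):
    """Mean squared difference of Gram matrices for one layer; the full
    method sums this over several layers."""
    return np.mean((gram_matrix(feats_gen) - gram_matrix(feats_style)) ** 2)
```

Because the Gram matrix discards all spatial information, it captures texture statistics independently of content layout, which is what allows style and content to be separated and recombined.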
Semantic Style Transfer and Turning Two-Bit Doodles into Fine Artworks
- Computer Science, ArXiv
- 2016
This paper introduces a novel concept to augment such generative architectures with semantic annotations, either by manually authoring pixel labels or using existing solutions for semantic segmentation, resulting in a content-aware generative algorithm that offers meaningful control over the outcome.
Quantitative Evaluation of Style Transfer
- Computer Science, ArXiv
- 2018
For current methods, each style likely requires a different choice of loss weights to obtain the best results, so automated weight-setting methods are desirable.
Arbitrary Style Transfer in Real-Time with Adaptive Instance Normalization
- Computer Science, 2017 IEEE International Conference on Computer Vision (ICCV)
- 2017
This paper presents a simple yet effective approach that for the first time enables arbitrary style transfer in real-time, comparable to the fastest existing approach, without the restriction to a pre-defined set of styles.
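The core operation of that paper, adaptive instance normalization (AdaIN), aligns the per-channel mean and standard deviation of the content features to those of the style features. A minimal NumPy sketch (not the authors' implementation):

```python
import numpy as np

def adain(content, style, eps=1e-5):
    """Adaptive instance normalization: normalize each content channel,
    then rescale and shift it with the style channel's statistics.

    content, style: (c, h, w) feature tensors; spatial sizes may differ.
    """
    c_mean = content.mean(axis=(1, 2), keepdims=True)
    c_std = content.std(axis=(1, 2), keepdims=True)
    s_mean = style.mean(axis=(1, 2), keepdims=True)
    s_std = style.std(axis=(1, 2), keepdims=True)
    return s_std * (content - c_mean) / (c_std + eps) + s_mean
```

Because this is a closed-form statistic swap rather than an optimization, it runs in a single feed-forward pass, which is what enables real-time arbitrary style transfer.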
Controlling Perceptual Factors in Neural Style Transfer
- Computer Science, 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
- 2017
The existing Neural Style Transfer method is extended to introduce control over spatial location, colour information and across spatial scale, enabling the combination of style information from multiple sources to generate new, perceptually appealing styles from existing ones.
Incorporating long-range consistency in CNN-based texture generation
- Computer Science, ICLR
- 2017
A simple modification to that representation of pair-wise products of features in a convolutional network is proposed which makes it possible to incorporate long-range structure into image generation, and to render images that satisfy various symmetry constraints.
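The modification described above correlates a feature map with a spatially shifted copy of itself, instead of with itself in place. A hedged NumPy sketch of a shifted Gram matrix in that spirit (the function name and exact normalization are illustrative assumptions):

```python
import numpy as np

def shifted_gram(feats, dx=0, dy=0):
    """Gram matrix between a feature map and a spatially shifted copy of
    itself. With dx = dy = 0 this reduces to the plain Gram matrix;
    matching it for several nonzero shifts injects long-range structure
    that plain Gram matrices, which discard spatial layout, cannot encode.

    feats: (c, h, w); dx/dy: shift in pixels along width/height.
    """
    c, h, w = feats.shape
    a = feats[:, : h - dy, : w - dx]  # original crop
    b = feats[:, dy:, dx:]            # shifted crop of the same size
    n = a.shape[1] * a.shape[2]
    return a.reshape(c, -1) @ b.reshape(c, -1).T / n
```

Matching such matrices for a small set of horizontal and vertical shifts is enough to make generated textures respect periodicity and symmetry constraints.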