Style Transfer by Relaxed Optimal Transport and Self-Similarity

@inproceedings{Kolkin2019StyleTB,
  title={Style Transfer by Relaxed Optimal Transport and Self-Similarity},
  author={Nicholas I. Kolkin and Jason Salavon and Gregory Shakhnarovich},
  booktitle={2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2019},
  pages={10043--10052}
}
The goal of style transfer algorithms is to render the content of one image using the style of another. We propose Style Transfer by Relaxed Optimal Transport and Self-Similarity (STROTSS), a new optimization-based style transfer algorithm. We extend our method to allow user-specified point-to-point or region-to-region control over visual similarity between the style image and the output. Such guidance can be used to either achieve a particular visual effect or correct errors made by…
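The core style term behind STROTSS is a relaxed optimal transport cost between feature sets. The sketch below is a minimal toy illustration (my own, not the authors' code, and the function names are hypothetical): the Relaxed Earth Mover's Distance replaces the full transport problem with the larger of two one-sided nearest-neighbour costs under a cosine-distance ground metric.

```python
import numpy as np

def cosine_cost(a, b):
    """Pairwise cosine-distance matrix between rows of a (n, d) and b (m, d)."""
    a_n = a / np.linalg.norm(a, axis=1, keepdims=True)
    b_n = b / np.linalg.norm(b, axis=1, keepdims=True)
    return 1.0 - a_n @ b_n.T

def relaxed_emd(a, b):
    """Relaxed EMD: max of the two one-sided nearest-neighbour transport costs."""
    c = cosine_cost(a, b)
    return max(c.min(axis=1).mean(), c.min(axis=0).mean())

rng = np.random.default_rng(0)
feats_out = rng.normal(size=(64, 32))     # toy features of the output image
feats_sty = rng.normal(size=(80, 32))     # toy features of the style image
print(relaxed_emd(feats_out, feats_sty))  # scalar style cost, 0 for identical sets
```

Because each side only has to match its nearest neighbour on the other side, the relaxation avoids solving a full assignment problem while still penalizing feature distributions that differ.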
Citations

STALP: Style Transfer with Auxiliary Limited Pairing
An approach to example-based stylization of images that uses a single pair of a source image and its stylized counterpart, which better preserves important visual characteristics of the source style and can deliver temporally stable results without the need to explicitly handle temporal consistency.
CAMS: Color-Aware Multi-Style Transfer
A color-aware multi-style transfer method that generates aesthetically pleasing results while preserving the style-color correlation between style and generated images, by introducing a simple but efficient modification to classic Gram matrix-based style transfer optimization.
Consistent Video Style Transfer via Relaxation and Regularization
This article identifies the cause of the conflict between style transfer and temporal consistency, and proposes to reconcile this contradiction by relaxing the objective function, so as to make the stylization loss term more robust to motions.
Language-Driven Image Style Transfer
Contrastive Language Visual Artist (CLVA) is proposed, which learns to extract visual semantics from style instructions, accomplishes LDIST via a patch-wise style discriminator, and compares contrastive pairs of content image and style instruction to improve the mutual relativeness between transfer results.
Geometric Style Transfer
This work introduces a neural architecture that supports transfer of geometric style, provides user studies that show the quality of the output, and quantifies the importance of geometric style transfer to style recognition by humans.
DualAST: Dual Style-Learning Networks for Artistic Style Transfer
A novel Dual Style-Learning Artistic Style Transfer (DualAST) framework that learns both the holistic artist-style and the specific artwork-style simultaneously from a single style image; experiments confirm the superiority of this method.
Less is More, Faithful Style Transfer without Content Loss
The dominant style transfer framework is based on separately defining ‘style loss’ and ‘content loss’, then finding an image that trades off between minimizing both. The challenge of operating in…
Wasserstein Style Transfer
Since Gaussians are closed under the Wasserstein barycenter, this allows simple style transfer, style mixing, and interpolation; the paper also shows how mixing different styles can be achieved using other geodesic metrics between Gaussians, such as the Fisher-Rao metric.
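For intuition about the closed forms this entry relies on, here is a toy 1-D sketch (my own illustration with hypothetical helper names, not the paper's code): between 1-D Gaussians, the 2-Wasserstein distance is W2² = (μ1 − μ2)² + (σ1 − σ2)², and their barycenter is again Gaussian, obtained by interpolating means and standard deviations.

```python
import math

def w2_gaussian_1d(mu1, sigma1, mu2, sigma2):
    """Closed-form 2-Wasserstein distance between two 1-D Gaussians."""
    return math.sqrt((mu1 - mu2) ** 2 + (sigma1 - sigma2) ** 2)

def barycenter_1d(mu1, sigma1, mu2, sigma2, t):
    """W2 barycenter of two 1-D Gaussians is Gaussian: linearly
    interpolate the means and the standard deviations."""
    return ((1 - t) * mu1 + t * mu2, (1 - t) * sigma1 + t * sigma2)

print(w2_gaussian_1d(0.0, 1.0, 3.0, 2.0))  # sqrt(10) ≈ 3.162
```

This closure property is what makes style mixing cheap: interpolating feature statistics along the Wasserstein geodesic never leaves the Gaussian family.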
Style-Aware Normalized Loss for Improving Arbitrary Style Transfer
Through investigation of the theoretical bounds of the AST style loss, a new loss is proposed that largely overcomes IST, validated with over 80% relative improvement in style deception rate and 98% relatively higher preference in human evaluation.
Efficient Style-Corpus Constrained Learning for Photorealistic Style Transfer
A novel Style-Corpus Constrained Learning (SCCL) scheme is proposed to constrain the stylized image with style consistency among different samples, which improves the photorealism of the stylized output.

References

Showing 1-10 of 29 references.
Arbitrary Style Transfer with Deep Feature Reshuffle
A novel method that reshuffles deep features of the style image for arbitrary style transfer, connecting both the global and local style losses respectively used by most parametric and non-parametric neural style transfer methods.
Photorealistic Style Transfer with Screened Poisson Equation
This paper proposes an approach that takes as input a stylized image and makes it more photorealistic; it relies on the Screened Poisson Equation, maintaining the fidelity of the stylized image while constraining its gradients to those of the original input image.
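The screened-Poisson blending step can be sketched in one dimension (my own toy illustration with a hypothetical helper; the paper operates on 2-D images): minimizing ||u − stylized||² + λ||∇u − ∇guide||² yields a linear system whose solution keeps the stylized values while pulling the gradients toward the guide's.

```python
import numpy as np

def screened_poisson_1d(stylized, guide, lam):
    """Solve (I + lam * DᵀD) u = stylized + lam * DᵀD @ guide, where D is
    the forward-difference operator: values follow `stylized`, gradients
    follow `guide`, with lam trading off between the two."""
    n = len(stylized)
    d = np.diff(np.eye(n), axis=0)       # forward differences, shape (n-1, n)
    a = np.eye(n) + lam * d.T @ d
    b = stylized + lam * d.T @ d @ guide
    return np.linalg.solve(a, b)

rng = np.random.default_rng(3)
s = rng.normal(size=(16,))               # toy "stylized" signal
g = rng.normal(size=(16,))               # toy "guide" (original input) signal
u = screened_poisson_1d(s, g, 5.0)
```

With lam = 0 the output is the stylized signal unchanged; as lam grows, the gradients of the result converge to those of the guide, which is the photorealism-restoring effect the paper exploits.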
Arbitrary Style Transfer in Real-Time with Adaptive Instance Normalization
This paper presents a simple yet effective approach that for the first time enables arbitrary style transfer in real time, comparable to the fastest existing approach, without the restriction to a pre-defined set of styles.
Controlling Perceptual Factors in Neural Style Transfer
The existing Neural Style Transfer method is extended to introduce control over spatial location, colour information, and spatial scale, enabling the combination of style information from multiple sources to generate new, perceptually appealing styles from existing ones.
A Closed-form Solution to Photorealistic Image Stylization
The results show that the proposed method generates photorealistic stylization outputs that are more preferred by human subjects as compared to those by the competing methods, while running much faster.
Decoder Network over Lightweight Reconstructed Feature for Fast Semantic Style Transfer
This paper proposes a new framework for fast semantic style transfer that not only achieves results competitive with backward optimization methods but also is about two orders of magnitude faster.
A Style-Aware Content Loss for Real-time HD Style Transfer
A style-aware content loss is proposed, trained jointly with a deep encoder-decoder network for real-time, high-resolution stylization of images and videos; results show that this approach better captures the subtle nature in which a style affects content.
Image Style Transfer Using Convolutional Neural Networks
A Neural Algorithm of Artistic Style is introduced that can separate and recombine the image content and style of natural images, providing new insights into the deep image representations learned by Convolutional Neural Networks and demonstrating their potential for high-level image synthesis and manipulation.
Stable and Controllable Neural Texture Synthesis and Style Transfer Using Histogram Losses
This paper first gives a mathematical explanation of the source of instabilities in many previous approaches, and then addresses these instabilities by using histogram losses to synthesize textures that better statistically match the exemplar.
Image quilting for texture synthesis and transfer
This work uses quilting as a fast and very simple texture synthesis algorithm which produces surprisingly good results for a wide range of textures, and extends the algorithm to perform texture transfer: rendering an object with a texture taken from a different object.