ArtFlow: Unbiased Image Style Transfer via Reversible Neural Flows

@article{An2021ArtFlowUI,
  title={ArtFlow: Unbiased Image Style Transfer via Reversible Neural Flows},
  author={Jie An and Siyu Huang and Yibing Song and Dejing Dou and Wei Liu and Jiebo Luo},
  journal={2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2021},
  pages={862-871}
}
  • Jie An, Siyu Huang, Yibing Song, Dejing Dou, Wei Liu, Jiebo Luo
  • Published 31 March 2021
  • Computer Science, Engineering
  • 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Universal style transfer retains styles from reference images in content images. While existing methods have achieved state-of-the-art style transfer performance, they are not aware of the content leak phenomenon, in which the image content may become corrupted after several rounds of the stylization process. In this paper, we propose ArtFlow to prevent content leak during universal style transfer. ArtFlow consists of reversible neural flows and an unbiased feature transfer module. It supports both forward and…
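The reversibility claim rests on normalizing flows being exactly invertible: the encoder that projects images into feature space can be run backward without a lossy decoder. Below is a minimal PyTorch sketch of an additive coupling block in the spirit of Glow, the flow family ArtFlow builds on; the class name and the shift network are illustrative assumptions, not ArtFlow's actual code.

```python
import torch
import torch.nn as nn

class AdditiveCoupling(nn.Module):
    """Minimal invertible block: split channels, shift one half by a
    function of the other. The inverse is exact, so no content is lost."""
    def __init__(self, channels):
        super().__init__()
        half = channels // 2  # assumes an even channel count
        # Small network predicting the shift; its architecture is arbitrary.
        self.net = nn.Sequential(
            nn.Conv2d(half, half, 3, padding=1), nn.ReLU(),
            nn.Conv2d(half, half, 3, padding=1),
        )

    def forward(self, x):            # projection: image/features -> latents
        x1, x2 = x.chunk(2, dim=1)
        return torch.cat([x1, x2 + self.net(x1)], dim=1)

    def inverse(self, y):            # reversion: latents -> image/features
        y1, y2 = y.chunk(2, dim=1)
        return torch.cat([y1, y2 - self.net(y1)], dim=1)
```

In a projection-transfer-reversion pipeline, forward would project both content and style into the latent space, an unbiased transfer module would align their statistics there, and inverse would map the result back to pixels, which is why repeated stylization rounds do not accumulate reconstruction error.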
StyTr^2: Unbiased Image Style Transfer with Transformers
TLDR
This work proposes a transformer-based approach, namely StyTr^2, which analyzes the deficiencies of existing positional encoding methods and proposes content-aware positional encoding (CAPE), which is scale-invariant and better suited to the image style transfer task.
Texture Reformer: Towards Fast and Universal Interactive Texture Transfer
TLDR
The texture reformer, a fast and universal neural-based framework for interactive texture transfer with user-specified guidance, is presented; it not only achieves higher-quality results but, remarkably, is also 2-5 orders of magnitude faster.
Sketch to portrait generation with generative adversarial networks and edge constraint

References

Showing 1-10 of 67 references
Universal Style Transfer via Feature Transforms
TLDR
The key ingredient of the method is a pair of feature transforms, whitening and coloring, embedded into an image reconstruction network; this reflects a direct matching of the feature covariance of the content image to that of a given style image.
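Since several works listed here build on this whitening-and-coloring transform (WCT), a compact sketch of the standard algorithm may help. The function below is an illustrative reimplementation, not the authors' released code; the name wct and the eps regularizer are my choices.

```python
import torch

def wct(content_feat, style_feat, eps=1e-5):
    """Whitening & coloring on (C, H, W) feature maps: matches the
    content feature covariance to that of the style features."""
    C, H, W = content_feat.shape
    fc = content_feat.reshape(C, -1)
    fs = style_feat.reshape(C, -1)

    fc = fc - fc.mean(dim=1, keepdim=True)
    ms = fs.mean(dim=1, keepdim=True)
    fs = fs - ms

    eye = eps * torch.eye(C, device=fc.device)

    # Whitening: remove correlations between content feature channels.
    Dc, Ec = torch.linalg.eigh(fc @ fc.t() / (fc.size(1) - 1) + eye)
    whitened = Ec @ torch.diag(Dc.clamp_min(eps).rsqrt()) @ Ec.t() @ fc

    # Coloring: impose the style covariance, then restore the style mean.
    Ds, Es = torch.linalg.eigh(fs @ fs.t() / (fs.size(1) - 1) + eye)
    colored = Es @ torch.diag(Ds.clamp_min(0.0).sqrt()) @ Es.t() @ whitened
    return (colored + ms).reshape(C, H, W)
```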
Real-time Universal Style Transfer on High-resolution Images via Zero-channel Pruning
TLDR
This work proposes a lightweight alternative architecture, ArtNet, which is based on GoogLeNet and pruned by a novel channel pruning method, Zero-channel Pruning, designed specifically for style transfer approaches; it also proposes a theoretically sound sandwich swap transform (S2) module to transfer deep features.
Multi-style Generative Network for Real-time Transfer
TLDR
MSG-Net is the first network to achieve real-time brush-size control in a purely feed-forward manner for style transfer, and it is compatible with most existing techniques, including content-style interpolation, color preservation, spatial control, and brush-stroke size control.
Multimodal Transfer: A Hierarchical Deep Convolutional Neural Network for Fast Artistic Style Transfer
TLDR
A multimodal convolutional neural network is proposed that takes into account faithful representations of both color and luminance channels and performs stylization hierarchically with multiple losses of increasing scales; by moving the more sophisticated training offline, it can perform style transfer in nearly real time.
Dynamic Instance Normalization for Arbitrary Style Transfer
TLDR
The proposed Dynamic Instance Normalization (DIN) provides flexible support for state-of-the-art convolutional operations and thus enables novel functionalities, such as uniform-stroke placement for non-natural images and automatic spatial-stroke control.
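For intuition, here is a deliberately simplified sketch of the dynamic-normalization idea: the affine parameters of instance normalization are predicted per sample from a style encoding rather than learned as fixed weights. This is an assumption-laden reduction; DIN's full formulation generates convolutional weights, not just per-channel scales and shifts.

```python
import torch
import torch.nn as nn

class DynamicInstanceNorm(nn.Module):
    """Instance norm whose scale/shift come from a style code at run time."""
    def __init__(self, style_dim, channels):
        super().__init__()
        # Hypothetical heads mapping a style vector to per-channel params.
        self.to_gamma = nn.Linear(style_dim, channels)
        self.to_beta = nn.Linear(style_dim, channels)

    def forward(self, x, style):
        # x: (N, C, H, W); style: (N, style_dim)
        mu = x.mean(dim=(2, 3), keepdim=True)
        sigma = x.std(dim=(2, 3), keepdim=True) + 1e-5
        gamma = self.to_gamma(style)[:, :, None, None]
        beta = self.to_beta(style)[:, :, None, None]
        return gamma * (x - mu) / sigma + beta
```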
Arbitrary Style Transfer with Deep Feature Reshuffle
TLDR
A novel method for arbitrary style transfer is introduced that reshuffles deep features of the style image, connecting the global and local style losses used, respectively, by most parametric and non-parametric neural style transfer methods.
Photorealistic Style Transfer via Wavelet Transforms
TLDR
This work proposes a wavelet-corrected transfer based on whitening and coloring transforms (WCT2) that allows features to preserve their structural information and the statistical properties of the VGG feature space during stylization, yielding stable video stylization without temporal constraints.
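The structure preservation comes from replacing max pooling with an invertible wavelet decomposition. Below is a sketch of Haar wavelet pooling under my own naming (haar_pool); real WCT2 code wires these kernels into a VGG encoder and mirrors them with transposed convolutions for exact unpooling.

```python
import torch
import torch.nn.functional as F

def haar_pool(x):
    """Split (N, C, H, W) features into LL/LH/HL/HH Haar components.
    The four kernels form an orthonormal basis of 2x2 patches, so the
    transform is exactly invertible via conv_transpose2d with the same
    kernels. Assumes H and W are even."""
    ll = torch.tensor([[0.5, 0.5], [0.5, 0.5]])
    lh = torch.tensor([[-0.5, -0.5], [0.5, 0.5]])
    hl = torch.tensor([[-0.5, 0.5], [-0.5, 0.5]])
    hh = torch.tensor([[0.5, -0.5], [-0.5, 0.5]])
    k = torch.stack([ll, lh, hl, hh]).unsqueeze(1)   # (4, 1, 2, 2)
    C = x.size(1)
    k = k.repeat(C, 1, 1, 1).to(x)                   # depthwise: 4 per channel
    return F.conv2d(x, k, stride=2, groups=C)        # (N, 4C, H/2, W/2)
```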
StyleBank: An Explicit Representation for Neural Image Style Transfer
TLDR
This work proposes StyleBank for neural image style transfer: it is composed of multiple convolution filter banks, with each filter bank explicitly representing one style. It is the first style transfer network that links back to traditional texton mapping methods and hence provides a new understanding of neural style transfer.
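The filter-bank idea is simple enough to sketch directly: a shared auto-encoder carries content, and one convolutional bank per style is swapped in at the bottleneck. The layer below is a minimal illustration assuming one 3x3 convolution per style; the actual StyleBank trains the banks jointly with the shared encoder-decoder.

```python
import torch.nn as nn

class StyleBankLayer(nn.Module):
    """One convolution filter bank per style; picking a bank picks a style."""
    def __init__(self, num_styles, channels):
        super().__init__()
        self.banks = nn.ModuleList(
            nn.Conv2d(channels, channels, 3, padding=1)
            for _ in range(num_styles)
        )

    def forward(self, feat, style_id):
        # feat: shared auto-encoder features; style_id selects the bank.
        return self.banks[style_id](feat)
```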
Collaborative Distillation for Ultra-Resolution Universal Style Transfer
TLDR
A new knowledge distillation method for encoder-decoder-based neural style transfer is proposed to reduce the number of convolutional filters; it achieves ultra-resolution (over 40 megapixels) universal style transfer on a 12GB GPU for the first time.
Learning Linear Transformations for Fast Arbitrary Style Transfer
TLDR
This work theoretically derives the form of the transformation matrix and presents an arbitrary style transfer approach that learns the matrix with a feed-forward network; the approach is highly efficient yet allows a flexible combination of multi-level styles while preserving content affinity during the style transfer process.
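As a rough illustration of learning such a transformation, the module below predicts a matrix from pooled content and style codes and applies it as a single batched matmul. This is a simplification under my own names (LinearTransfer, compress/uncompress): the paper derives the matrix from feature covariances with a dedicated CNN, not from pooled vectors.

```python
import torch
import torch.nn as nn

class LinearTransfer(nn.Module):
    """Predict an h x h matrix from content/style codes and apply it
    to compressed content features in one batched matmul."""
    def __init__(self, channels, hidden=64):
        super().__init__()
        self.compress = nn.Conv2d(channels, hidden, 1)
        self.uncompress = nn.Conv2d(hidden, channels, 1)
        # Hypothetical head: pooled codes -> flattened transform matrix.
        self.to_matrix = nn.Linear(2 * hidden, hidden * hidden)

    def forward(self, content, style):
        c = self.compress(content)               # (N, h, H, W)
        s = self.compress(style)
        code = torch.cat([c.mean(dim=(2, 3)), s.mean(dim=(2, 3))], dim=1)
        n, h, H, W = c.shape
        T = self.to_matrix(code).view(n, h, h)
        out = torch.bmm(T, c.reshape(n, h, -1)).reshape(n, h, H, W)
        return self.uncompress(out)
```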