Multimodal Style Transfer via Graph Cuts

@inproceedings{Zhang2019MultimodalST,
  title={Multimodal Style Transfer via Graph Cuts},
  author={Yulun Zhang and Chen Fang and Yilin Wang and Zhaowen Wang and Zhe L. Lin and Yun Raymond Fu and Jimei Yang},
  booktitle={2019 IEEE/CVF International Conference on Computer Vision (ICCV)},
  year={2019},
  pages={5942-5950}
}
An assumption widely used in recent neural style transfer methods is that image styles can be described by global statistics of deep features, such as Gram or covariance matrices. [...] MST explicitly considers the matching of semantic patterns in content and style images. Specifically, the style image features are clustered into sub-style components, which are matched with local content features under a graph cut formulation.
Manifold Alignment for Semantically Aligned Style Transfer
TLDR
This paper makes a different assumption, namely that locally semantically aligned (or similar) regions between the content and style images should share similar style patterns, and proposes a manifold alignment based style transfer method, MAST, which achieves appealing results in style transfer.
Diversified Patch-based Style Transfer with Shifted Style Normalization
TLDR
This work dives into the core style-swapping process of patch-based style transfer and identifies an operation called shifted style normalization (SSN) as the most effective and efficient way to empower existing patch-based methods to generate diverse results for arbitrary styles.
Diverse Image Style Transfer via Invertible Cross-Space Mapping
Image style transfer aims to transfer the styles of artworks onto arbitrary photographs to create novel artistic images. Although style transfer is inherently an underdetermined problem, existing [...]
StyleFormer: Real-time Arbitrary Style Transfer via Parametric Style Composition
In this work, we propose a new feed-forward arbitrary style transfer method, referred to as StyleFormer, which can simultaneously achieve fine-grained style diversity and semantic content coherency.
Domain-Aware Universal Style Transfer
TLDR
A unified architecture, Domain-aware Style Transfer Networks (DSTN), is proposed that transfers not only the style but also the domain property (i.e., domainness) of a given reference image, reproducing content images with the styles of reference images.
DualAST: Dual Style-Learning Networks for Artistic Style Transfer
TLDR
A novel Dual Style-Learning Artistic Style Transfer (DualAST) framework is proposed to learn simultaneously both the holistic artist-style and the specific artwork-style from a single style image; experiments confirm the superiority of this method.
Style Transfer with Target Feature Palette and Attention Coloring
TLDR
Qualitative and quantitative results show that the proposed method achieves state-of-the-art performance, with particular strength in preserving the core structures and details of the content image; an exhaustive ablation study provides in-depth analysis of and insight into the proposed method.
Collaborative Distillation for Ultra-Resolution Universal Style Transfer
TLDR
A new knowledge distillation method is proposed for encoder-decoder based neural style transfer to reduce the number of convolutional filters; it achieves ultra-resolution (over 40 megapixels) universal style transfer on a 12GB GPU for the first time.
Deep Style Transfer for Line Drawings
TLDR
This paper proposes to formulate the style transfer problem as a centerline stylization problem and solve it via a novel style-guided image-to-image translation network, which significantly outperforms existing methods both visually and quantitatively.
UVStyle-Net: Unsupervised Few-shot Learning of 3D Style Similarity Measure for B-Reps
TLDR
UVStyle-Net is proposed, a style similarity measure for B-Reps that leverages the style signals in the second-order statistics of the activations in a pre-trained (unsupervised) 3D encoder and learns their relative importance to a subjective end-user through few-shot learning.

References

Showing 10 of 45 references.
Universal Style Transfer via Feature Transforms
TLDR
The key ingredient of the method is a pair of feature transforms, whitening and coloring, embedded in an image reconstruction network; this reflects a direct matching of the feature covariance of the content image to that of a given style image.
Separating Style and Content for Generalized Style Transfer
TLDR
This work attempts to separate the representations for styles and contents, and proposes a generalized style transfer network consisting of a style encoder, a content encoder, a mixer, and a decoder, which allows simultaneous style transfer among multiple styles and can be deemed a special 'multi-task' learning scenario.
Arbitrary Style Transfer with Deep Feature Reshuffle
TLDR
A novel method for arbitrary style transfer is introduced that reshuffles the deep features of the style image; it connects the global and local style losses used by most parametric and non-parametric neural style transfer methods, respectively.
Learning Linear Transformations for Fast Arbitrary Style Transfer
TLDR
This work derives the form of the transformation matrix theoretically and presents an arbitrary style transfer approach that learns the transformation matrix with a feed-forward network, which is highly efficient yet allows a flexible combination of multi-level styles while preserving content affinity during the style transfer process.
Fast Patch-based Style Transfer of Arbitrary Style
TLDR
A simpler optimization objective based on local matching is proposed that combines the content structure and style textures in a single layer of the pretrained network; it has desirable properties such as a simpler optimization landscape, intuitive parameter tuning, and consistent frame-by-frame performance on video.
Multimodal Transfer: A Hierarchical Deep Convolutional Neural Network for Fast Artistic Style Transfer
TLDR
A multimodal convolutional neural network is proposed that takes into consideration faithful representations of both color and luminance channels and performs stylization hierarchically with multiple losses of increasing scales; it can perform style transfer in nearly real time by moving much more sophisticated training offline.
Avatar-Net: Multi-scale Zero-Shot Style Transfer by Feature Decoration
TLDR
This paper proposes an efficient yet effective Avatar-Net that enables visually plausible multi-scale transfer for arbitrary style in real-time and demonstrates the state-of-the-art effectiveness and efficiency of the proposed method in generating high-quality stylized images.
Neural Style Transfer via Meta Networks
TLDR
A novel method is presented to generate the specified network parameters through one feed-forward propagation in the meta networks for neural style transfer; it can handle an arbitrary new style within 19 milliseconds on one modern GPU card.
Deep Photo Style Transfer
TLDR
This paper introduces a deep-learning approach to photographic style transfer that handles a large variety of image content while faithfully transferring the reference style; it constrains the transformation from the input to the output to be locally affine in colorspace.
Style Transfer Via Texture Synthesis
TLDR
This paper proposes a novel style transfer algorithm that extends the texture synthesis work of Kwatra et al. (2005), aiming for stylized images that are closer in quality to CNN-based results.