Corpus ID: 229339966

Image Translation via Fine-grained Knowledge Transfer

Xuanhong Chen, Ziang Liu, Ting Qiu, Bingbing Ni, Naiyuan Liu, Xiwei Hu, Yuhan Li
Prevailing image-translation frameworks mostly process images in an end-to-end fashion, which has achieved convincing results. Nonetheless, these methods lack interpretability and do not scale across different image-translation tasks (e.g., style transfer, HDR, etc.). In this paper, we propose an interpretable knowledge-based image-translation framework, which realizes image translation through knowledge retrieval and transfer. In detail, the framework constructs a plug-and-play and…


Arbitrary Style Transfer in Real-Time with Adaptive Instance Normalization
This paper presents a simple yet effective approach that for the first time enables arbitrary style transfer in real-time, comparable to the fastest existing approach, without the restriction to a pre-defined set of styles.
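The AdaIN operation at the heart of this approach aligns the per-channel mean and standard deviation of the content features to those of the style features. A minimal NumPy sketch (function name, shapes, and the `eps` regularizer are illustrative, not the paper's implementation):

```python
import numpy as np

def adain(content, style, eps=1e-5):
    """Adaptive Instance Normalization: shift/scale the content
    features so each channel matches the style features' statistics.
    content, style: feature maps of shape (C, H, W)."""
    c_mean = content.mean(axis=(1, 2), keepdims=True)
    c_std = content.std(axis=(1, 2), keepdims=True)
    s_mean = style.mean(axis=(1, 2), keepdims=True)
    s_std = style.std(axis=(1, 2), keepdims=True)
    # Normalize content to zero mean / unit std, then re-scale to style stats.
    return s_std * (content - c_mean) / (c_std + eps) + s_mean
```

Because the transform is a closed-form per-channel affine map with no learned per-style parameters, it can be applied to an arbitrary style at test time, which is what enables real-time arbitrary style transfer.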
A Closed-Form Solution to Universal Style Transfer
A novel interpretation of feature transform is proposed by treating it as an optimal transport problem, and a closed-form solution named Optimal Style Transfer (OST) is derived under this formulation by additionally considering the content loss of Gatys et al.
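When content and style features are modeled as Gaussians, the optimal transport view admits a closed-form Monge map, T = Sc^(-1/2) (Sc^(1/2) Ss Sc^(1/2))^(1/2) Sc^(-1/2), applied to the centered content features. A simplified NumPy sketch of that map (names, shapes, and the `eps` regularizer are assumptions; this illustrates the Gaussian OT map, not the full OST method with its content-loss term):

```python
import numpy as np

def psd_sqrt(M):
    """Matrix square root of a symmetric PSD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    w = np.clip(w, 0.0, None)
    return (V * np.sqrt(w)) @ V.T

def gaussian_ot_transfer(content_feats, style_feats, eps=1e-5):
    """Map content features so their Gaussian statistics match the style's.
    content_feats, style_feats: (C, N) matrices of C-dim features."""
    C = content_feats.shape[0]
    mu_c = content_feats.mean(axis=1, keepdims=True)
    mu_s = style_feats.mean(axis=1, keepdims=True)
    Xc = content_feats - mu_c
    Xs = style_feats - mu_s
    # Regularized covariances.
    Sc = Xc @ Xc.T / Xc.shape[1] + eps * np.eye(C)
    Ss = Xs @ Xs.T / Xs.shape[1] + eps * np.eye(C)
    Sc_half = psd_sqrt(Sc)
    Sc_inv_half = np.linalg.inv(Sc_half)
    # Closed-form Monge map between the two Gaussians.
    T = Sc_inv_half @ psd_sqrt(Sc_half @ Ss @ Sc_half) @ Sc_inv_half
    return T @ Xc + mu_s
```

The transformed features have (up to the regularizer) the style's mean and covariance, while the map itself is the one that moves the content distribution the least in Wasserstein distance.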
Anisotropic Stroke Control for Multiple Artists Style Transfer
This work designs an Anisotropic Stroke Module (ASM) which realizes the dynamic adjustment of style-stroke between the non-trivial and the trivial regions, and presents a novel Multi-Scale Projection Discriminator to realize texture-level conditional generation.
Avatar-Net: Multi-scale Zero-Shot Style Transfer by Feature Decoration
This paper proposes an efficient yet effective Avatar-Net that enables visually plausible multi-scale transfer for arbitrary style in real-time and demonstrates the state-of-the-art effectiveness and efficiency of the proposed method in generating high-quality stylized images.
A Style-Aware Content Loss for Real-time HD Style Transfer
A style-aware content loss is proposed, which is trained jointly with a deep encoder-decoder network for real-time, high-resolution stylization of images and videos and results show that this approach better captures the subtle nature in which a style affects content.
StyleBank: An Explicit Representation for Neural Image Style Transfer
This work proposes StyleBank, which is composed of multiple convolution filter banks and each filter bank explicitly represents one style, for neural image style transfer, the first style transfer network that links back to traditional texton mapping methods, and hence provides new understanding on neural style transfer.
ComboGAN: Unrestrained Scalability for Image Domain Translation
This paper proposes a multi-component image translation model and training scheme which scales linearly - both in resource consumption and time required - with the number of domains and demonstrates its capabilities on a dataset of paintings by 14 different artists.
Pair-wise Exchangeable Feature Extraction for Arbitrary Style Transfer
This paper argues that only aligning the global statistics of deep features cannot always guarantee a good style transfer and proposes to jointly analyze the input image pair and extract common/exchangeable style features between the two.
Attention-Aware Multi-Stroke Style Transfer
This paper proposes to assemble self-attention mechanism into a style-agnostic reconstruction autoencoder framework, from which the attention map of a content image can be derived, and develops an attention-aware multi-stroke style transfer model.
Arbitrary Style Transfer with Deep Feature Reshuffle
A novel method by reshuffling deep features of the style image for arbitrary style transfer that connects both global and local style losses respectively used by most parametric and non-parametric neural style transfer methods is introduced.