• Computer Science
  • Published in ArXiv 2019

Conditional Neural Style Transfer with Peer-Regularized Feature Transform

@article{Svoboda2019ConditionalNS,
  title={Conditional Neural Style Transfer with Peer-Regularized Feature Transform},
  author={Jan Svoboda and Asha Anoosheh and Christian Osendorfer and Jonathan Masci},
  journal={ArXiv},
  year={2019},
  volume={abs/1906.02913}
}
This paper introduces a neural style transfer model that conditionally generates a stylized image using only a set of examples describing the desired style. The proposed solution produces high-quality images even in the zero-shot setting and allows greater freedom in changing the content geometry. This is made possible by a novel Peer-Regularization Layer that recomposes style in latent space by means of a custom graph convolutional layer designed to separate style from content…
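The paper itself does not spell out the Peer-Regularization Layer in this excerpt. As a rough illustrative sketch only — not the authors' implementation — one can picture the core idea of recomposing a latent feature as an attention-weighted combination of "peer" style features; all names here (`peer_regularize`, `peer_feats`) are hypothetical:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def peer_regularize(content_feat, peer_feats):
    """Hypothetical simplification of peer regularization:
    score the latent feature against each peer style feature,
    then recompose it as the attention-weighted sum of peers."""
    scores = [cosine(content_feat, p) for p in peer_feats]
    weights = softmax(scores)
    dim = len(content_feat)
    return [sum(w * p[i] for w, p in zip(weights, peer_feats))
            for i in range(dim)]

# Example: recompose a 3-d latent feature from two style peers.
peers = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
out = peer_regularize([0.9, 0.1, 0.0], peers)
```

Because the weights sum to one, the output always lies in the convex hull of the peer features, which is one simple way style can be "recomposed" from examples; the actual layer in the paper operates on graph-structured feature patches rather than whole vectors.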

References

Publications referenced by this paper (showing 1–10 of 37 references).

Generative Adversarial Nets


Improved Texture Networks: Maximizing Quality and Diversity in Feed-Forward Stylization and Texture Synthesis

  • 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
  • 2017

Arbitrary Style Transfer in Real-Time with Adaptive Instance Normalization

  • 2017 IEEE International Conference on Computer Vision (ICCV)
  • 2017

Avatar-Net: Multi-scale Zero-Shot Style Transfer by Feature Decoration

  • 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition
  • 2018