Rethink Arbitrary Style Transfer with Transformer and Contrastive Learning
arXiv (2024)
Abstract
Arbitrary style transfer has attracted widespread research attention and has
numerous practical applications. The existing methods, which either employ
cross-attention to incorporate deep style attributes into content attributes or
use adaptive normalization to adjust content features, fail to generate
high-quality stylized images. In this paper, we introduce an innovative
technique to improve the quality of stylized images. Firstly, we propose Style
Consistency Instance Normalization (SCIN), a method to refine the alignment
between content and style features. In addition, we have developed an
Instance-based Contrastive Learning (ICL) approach designed to understand the
relationships among various styles, thereby enhancing the quality of the
resulting stylized images. Recognizing that VGG networks are adept at
extracting classification features but less suited to capturing style
features, we also introduce the Perception Encoder (PE) to extract style
features. Extensive experiments demonstrate that our proposed method
generates high-quality stylized images and effectively prevents artifacts
compared with the existing state-of-the-art methods.
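The abstract does not give SCIN's exact formulation. For context only, here is a minimal NumPy sketch of the classical adaptive instance normalization (AdaIN) baseline that such content-style alignment methods build on: it re-normalizes each content channel to carry the style's per-channel mean and standard deviation. The function name and tensor shapes are illustrative, not the paper's implementation.

```python
import numpy as np

def adain(content, style, eps=1e-5):
    """AdaIN-style feature alignment (illustrative sketch, not the paper's SCIN).

    content, style: feature maps of shape (C, H, W).
    Returns content features whose per-channel mean/std match the style's.
    """
    # Per-channel statistics over spatial dimensions
    c_mean = content.mean(axis=(1, 2), keepdims=True)
    c_std = content.std(axis=(1, 2), keepdims=True) + eps
    s_mean = style.mean(axis=(1, 2), keepdims=True)
    s_std = style.std(axis=(1, 2), keepdims=True) + eps
    # Whiten the content channel-wise, then re-color with style statistics
    return s_std * (content - c_mean) / c_std + s_mean

rng = np.random.default_rng(0)
content_feat = rng.normal(size=(4, 8, 8))
style_feat = rng.normal(loc=2.0, scale=3.0, size=(4, 8, 8))
aligned = adain(content_feat, style_feat)
```

After the call, `aligned` has the spatial structure of `content_feat` but the per-channel statistics of `style_feat`, which is the alignment behavior the paper's SCIN module aims to refine.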