
Image Style Transfer Based on Deep Feature Rotation and Adaptive Instance Normalization



Abstract:

Style transfer has attracted considerable attention in recent years. The technique takes an image that provides a style and applies a neural network to transfer that style onto a content target. Most existing methods obtain the weight parameters by training on a single image feature and continuously optimizing the network structure to improve computational efficiency and image quality, but they consider the image features from only one perspective, which may lead to information loss. In this paper, we add Deep Feature Rotation (DFR) to the AdaIN network, which allows us to generate multiple features from one image feature by rotation and to train on these features jointly. In this way, we perform more comprehensive feature extraction for the stylized image and preserve more complete feature information. We evaluate different combinations of rotation angles and compare the results with other methods. The code is available at https://github.com/SP-FA/Style-Transfer-Based-on-DFR-and-AdaIN
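
The abstract outlines the idea of combining AdaIN with rotated deep features but gives no implementation details here. The following is a minimal sketch, assuming standard AdaIN (matching per-channel mean and standard deviation over the spatial dimensions) and rotations by multiples of 90 degrees applied to the stylized feature map; the function names adain and rotated_stylized_features and the choice of angles are illustrative assumptions, not the authors' exact code.

import torch

def adain(content_feat, style_feat, eps=1e-5):
    # Adaptive Instance Normalization: align the per-channel mean/std of the
    # content feature with those of the style feature (statistics over H, W).
    c_mean = content_feat.mean(dim=(2, 3), keepdim=True)
    c_std = content_feat.std(dim=(2, 3), keepdim=True) + eps
    s_mean = style_feat.mean(dim=(2, 3), keepdim=True)
    s_std = style_feat.std(dim=(2, 3), keepdim=True) + eps
    return s_std * (content_feat - c_mean) / c_std + s_mean

def rotated_stylized_features(content_feat, style_feat, ks=(0, 1, 2, 3)):
    # Hypothetical DFR step: rotate the AdaIN-stylized feature map by
    # k * 90 degrees for each k, yielding several features from one
    # content/style pair that can all be used during training.
    t = adain(content_feat, style_feat)
    return [torch.rot90(t, k, dims=(2, 3)) for k in ks]

# Example with dummy encoder features of shape (N, C, H, W):
c = torch.randn(1, 512, 32, 32)
s = torch.randn(1, 512, 32, 32)
feats = rotated_stylized_features(c, s)  # list of 4 rotated stylized features

Under this assumption, each rotated feature would be passed to the decoder, so a single content-style pair supplies multiple training targets rather than one.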
Date of Conference: 17-19 October 2023
Date Added to IEEE Xplore: 21 December 2023
Conference Location: Zakopane, Poland

I. Introduction

In recent years, with the development of deep learning, artificial intelligence has become able to handle increasingly complex computer vision tasks. Scholars have also started to use artificial intelligence for artistic creation: by letting neural networks learn from past artworks, it was found that computers can analyze artworks and create new ones [1]. A popular technique for artistic creation is image style transfer. This technique extracts features from one or more images designated as reference styles and stylizes input images, defined as content images, whose structure remains unchanged. The style transfer neural network outputs an image that combines the content of the input image with the styles of one or more reference images [2]. With this technique, the styles of world-famous artists can be transferred to any existing image.
