
Academic Journal of Computing & Information Science, 2021, 4(8); doi: 10.25236/AJCIS.2021.040816.

Research on Image Style Convolution Neural Network Migration Based on Deep Hybrid Generation Model

Author(s)

Junhu Zhou, Yujie Wang, Jiafang Gong, Guoqing Dong, Wentao Ma

Corresponding Author:
Junhu Zhou
Affiliation(s)

School of IMUT, Inner Mongolia University of Technology, Hohhot, Inner Mongolia, 010051, China

Abstract

Image style transfer transforms the style of an image from one domain to another. This task imposes new requirements on traditional convolutional neural network architectures, so deep hybrid generative models are commonly used to address it. The goal of image style transfer is to generate a new image that preserves the original content while rendering it in a different style. Against this background, this paper proposes an image style transfer convolutional neural network model based on a deep hybrid generative model, improving image quality through image processing. The deep hybrid generative model mainly combines a generative adversarial network with an autoencoder. According to the different basic tasks of image style transfer, this paper designs both unsupervised and supervised transfer methods: an unsupervised image style transfer method using an adversarial network with cycle consistency, and a supervised image style transfer method using an adversarial network based on a cross-domain autoencoder. The quality of the generated images is further improved by introducing standard datasets for unsupervised and supervised image style transfer.
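The cycle-consistency constraint mentioned in the abstract can be illustrated with a minimal toy sketch. The two "generators" below are hypothetical stand-ins (not the paper's actual networks): G maps domain A to domain B and F maps back, and the cycle loss penalizes the L1 reconstruction error after a full A→B→A round trip, which encourages the pair of mappings to preserve image content while changing style.

```python
import numpy as np

def G(x):
    # Toy stand-in for the A->B generator: a fixed affine "style" shift.
    return x * 1.5 + 0.1

def F(y):
    # Toy stand-in for the B->A generator: here the exact inverse of G.
    return (y - 0.1) / 1.5

def cycle_consistency_loss(x, forward, backward):
    """Mean L1 reconstruction error after a full A -> B -> A cycle."""
    reconstructed = backward(forward(x))
    return float(np.mean(np.abs(reconstructed - x)))

# A toy "image" batch: 2 images of 4x4 pixels.
rng = np.random.default_rng(0)
x = rng.random((2, 4, 4))
loss = cycle_consistency_loss(x, G, F)
```

Because F is the exact inverse of G in this sketch, the cycle loss is near zero; during actual training the generators are imperfect, so this term is positive and is minimized jointly with the adversarial losses.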

Keywords

Deep learning; Image style transfer; Convolutional neural network

Cite This Paper

Junhu Zhou, Yujie Wang, Jiafang Gong, Guoqing Dong, Wentao Ma. Research on Image Style Convolution Neural Network Migration Based on Deep Hybrid Generation Model. Academic Journal of Computing & Information Science (2021), Vol. 4, Issue 8: 83-89. https://doi.org/10.25236/AJCIS.2021.040816.
