The following publications are possibly variants of this publication:
- Arbitrary Style Transfer with Parallel Self-Attention. Tiange Zhang, Ying Gao, Feng Gao, Lin Qi, Junyu Dong. icpr 2021: 1406-1413 [doi]
- Pyramid style-attentional network for arbitrary style transfer. Gaoming Yang, Shicheng Zhang, Xianjin Fang, Ji Zhang. mta, 83(5):13483-13502, February 2024. [doi]
- GCSANet: Arbitrary Style Transfer With Global Context Self-Attentional Network. Zhongyu Bai, Hongli Xu, Xiangyue Zhang, Qichuan Ding. tmm, 26:1407-1420, 2024. [doi]
- Multi-Attention Network for Arbitrary Style Transfer. Sihui Hua, DongDong Zhang. iconip 2021: 390-401 [doi]
- CSAST: Content self-supervised and style contrastive learning for arbitrary style transfer. Yuqi Zhang, Yingjie Tian, Junjie Hou. NN, 164:146-155, July 2023. [doi]
- Arbitrary Style Transfer With Fused Convolutional Block Attention Modules. Haitao Xin, Li Li. access, 11:44977-44988, 2023. [doi]
- Multi-scale Attention Enhancement for Arbitrary Style Transfer via Contrast Learning. Lei Zhou, Tao Zhang. iccai 2023: 650-656 [doi]
- AdaAttN: Revisit Attention Mechanism in Arbitrary Neural Style Transfer. Songhua Liu, Tianwei Lin, Dongliang He, Fu Li, Meiling Wang, Xin Li, Zhengxing Sun, Qian Li, Errui Ding. iccv 2021: 6629-6638 [doi]