The following publications are possibly variants of this publication:
- Alternating Language Modeling for Cross-Lingual Pre-Training. Jian Yang, Shuming Ma, Dongdong Zhang, Shuangzhi Wu, Zhoujun Li, Ming Zhou 0001. AAAI 2020: 9386-9393
- Cross-Lingual Natural Language Generation via Pre-Training. Zewen Chi, Li Dong 0004, Furu Wei, Wenhui Wang, Xian-Ling Mao, Heyan Huang. AAAI 2020: 7570-7577
- Allocating Large Vocabulary Capacity for Cross-Lingual Language Model Pre-Training. Bo Zheng, Li Dong 0004, Shaohan Huang, Saksham Singhal, Wanxiang Che, Ting Liu 0001, Xia Song, Furu Wei. EMNLP 2021: 3203-3215
- InfoXLM: An Information-Theoretic Framework for Cross-Lingual Language Model Pre-Training. Zewen Chi, Li Dong 0004, Furu Wei, Nan Yang 0002, Saksham Singhal, Wenhui Wang, Xia Song, Xian-Ling Mao, Heyan Huang, Ming Zhou 0001. NAACL 2021: 3576-3588
- Unifying Cross-Lingual and Cross-Modal Modeling Towards Weakly Supervised Multilingual Vision-Language Pre-training. Zejun Li, Zhihao Fan, Jingjing Chen, Qi Zhang, Xuanjing Huang 0001, Zhongyu Wei. ACL 2023: 5939-5958