Code-switched Language Models Using Dual RNNs and Same-Source Pretraining

Saurabh Garg, Tanmay Parekh, Preethi Jyothi. Code-switched Language Models Using Dual RNNs and Same-Source Pretraining. In Ellen Riloff, David Chiang, Julia Hockenmaier, Jun'ichi Tsujii, editors, Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, Brussels, Belgium, October 31 - November 4, 2018, pages 3078-3083. Association for Computational Linguistics, 2018.
