Simulated multiple reference training improves low-resource machine translation

Huda Khayrallah, Brian Thompson, Matt Post, Philipp Koehn. Simulated multiple reference training improves low-resource machine translation. In Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu, editors, Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020, Online, November 16-20, 2020. pages 82-89, Association for Computational Linguistics, 2020.

@inproceedings{KhayrallahTPK20,
  title = {Simulated multiple reference training improves low-resource machine translation},
  author = {Huda Khayrallah and Brian Thompson and Matt Post and Philipp Koehn},
  year = {2020},
  url = {https://www.aclweb.org/anthology/2020.emnlp-main.7/},
  researchr = {https://researchr.org/publication/KhayrallahTPK20},
  pages = {82--89},
  booktitle = {Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020, Online, November 16-20, 2020},
  editor = {Bonnie Webber and Trevor Cohn and Yulan He and Yang Liu},
  publisher = {Association for Computational Linguistics},
  isbn = {978-1-952148-60-6},
}