Autoregressive Knowledge Distillation through Imitation Learning

Alexander Lin, Jeremy Wohlwend, Howard Chen, Tao Lei. Autoregressive Knowledge Distillation through Imitation Learning. In Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu, editors, Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP 2020), Online, November 16-20, 2020, pages 6121-6133. Association for Computational Linguistics, 2020.
