A Systematic Study of Knowledge Distillation for Natural Language Generation with Pseudo-Target Training

Nitay Calderon, Subhabrata Mukherjee, Roi Reichart, Amir Kantor. A Systematic Study of Knowledge Distillation for Natural Language Generation with Pseudo-Target Training. In Anna Rogers, Jordan L. Boyd-Graber, Naoaki Okazaki, editors, Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), ACL 2023, Toronto, Canada, July 9-14, 2023, pages 14632-14659. Association for Computational Linguistics, 2023.

Authors

Nitay Calderon

Subhabrata Mukherjee

Roi Reichart

Amir Kantor