Memory Attentive Fusion: External Language Model Integration for Transformer-based Sequence-to-Sequence Model

Mana Ihori, Ryo Masumura, Naoki Makishima, Tomohiro Tanaka, Akihiko Takashima, Shota Orihashi. Memory Attentive Fusion: External Language Model Integration for Transformer-based Sequence-to-Sequence Model. In Brian Davis, Yvette Graham, John Kelleher, Yaji Sripada, editors, Proceedings of the 13th International Conference on Natural Language Generation (INLG 2020), Dublin, Ireland, December 15-18, 2020, pages 1-6. Association for Computational Linguistics, 2020.
