Memory Attentive Fusion: External Language Model Integration for Transformer-based Sequence-to-Sequence Model

Mana Ihori, Ryo Masumura, Naoki Makishima, Tomohiro Tanaka, Akihiko Takashima, Shota Orihashi. Memory Attentive Fusion: External Language Model Integration for Transformer-based Sequence-to-Sequence Model. In Brian Davis, Yvette Graham, John Kelleher, Yaji Sripada, editors, Proceedings of the 13th International Conference on Natural Language Generation (INLG 2020), Dublin, Ireland, December 15-18, 2020, pages 1-6. Association for Computational Linguistics, 2020.
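The title describes attention-based integration of an external language model (LM) into a Transformer sequence-to-sequence decoder. As a rough illustration only, not the authors' published architecture, the PyTorch sketch below shows one common way to realize such fusion: a decoder layer with a third cross-attention that reads the hidden states of a pre-trained LM as an external memory. All module names, dimensions, and the placement of the memory attention are assumptions.

    import torch
    import torch.nn as nn

    class MemoryAttentiveDecoderLayer(nn.Module):
        # Transformer decoder layer with an extra cross-attention over the
        # hidden states of an external LM ("memory"). Illustrative sketch;
        # sizes and fusion placement are assumptions, not the paper's spec.
        def __init__(self, d_model=512, nhead=8, dim_ff=2048, dropout=0.1):
            super().__init__()
            self.self_attn = nn.MultiheadAttention(d_model, nhead, dropout=dropout, batch_first=True)
            self.enc_attn = nn.MultiheadAttention(d_model, nhead, dropout=dropout, batch_first=True)
            self.mem_attn = nn.MultiheadAttention(d_model, nhead, dropout=dropout, batch_first=True)
            self.ff = nn.Sequential(nn.Linear(d_model, dim_ff), nn.ReLU(), nn.Linear(dim_ff, d_model))
            self.norms = nn.ModuleList([nn.LayerNorm(d_model) for _ in range(4)])
            self.drop = nn.Dropout(dropout)

        def forward(self, tgt, enc_out, lm_memory, tgt_mask=None):
            # Masked self-attention over the partial target sequence.
            a, _ = self.self_attn(tgt, tgt, tgt, attn_mask=tgt_mask)
            x = self.norms[0](tgt + self.drop(a))
            # Standard cross-attention over the source encoder outputs.
            a, _ = self.enc_attn(x, enc_out, enc_out)
            x = self.norms[1](x + self.drop(a))
            # Extra cross-attention over the external LM's hidden states,
            # letting the decoder read the LM as an attentive memory.
            a, _ = self.mem_attn(x, lm_memory, lm_memory)
            x = self.norms[2](x + self.drop(a))
            # Position-wise feed-forward with residual connection.
            return self.norms[3](x + self.drop(self.ff(x)))

    # Shape check with random tensors (batch=2, d_model=512):
    layer = MemoryAttentiveDecoderLayer()
    tgt = torch.randn(2, 10, 512)      # decoder inputs
    enc_out = torch.randn(2, 20, 512)  # encoder outputs
    lm_mem = torch.randn(2, 10, 512)   # external LM states (projected)
    out = layer(tgt, enc_out, lm_mem)  # -> (2, 10, 512)

In such a setup the external LM would typically be kept frozen, with its hidden states projected to the decoder width before being passed in as lm_memory.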

Authors

Mana Ihori
Ryo Masumura
Naoki Makishima
Tomohiro Tanaka
Akihiko Takashima
Shota Orihashi