Every Layer Counts: Multi-Layer Multi-Head Attention for Neural Machine Translation

Isaac K. E. Ampomah, Sally I. McClean, Zhiwei Lin, Glenn I. Hawe. Every Layer Counts: Multi-Layer Multi-Head Attention for Neural Machine Translation. The Prague Bulletin of Mathematical Linguistics, 115:51-82, 2020.
