Every Layer Counts: Multi-Layer Multi-Head Attention for Neural Machine Translation

Isaac K. E. Ampomah, Sally I. McClean, Zhiwei Lin, Glenn I. Hawe. Every Layer Counts: Multi-Layer Multi-Head Attention for Neural Machine Translation. Prague Bulletin of Mathematical Linguistics, 115:51-82, 2020.
