Every Layer Counts: Multi-Layer Multi-Head Attention for Neural Machine Translation

Isaac K. E. Ampomah, Sally I. McClean, Zhiwei Lin 0002, Glenn I. Hawe. Every Layer Counts: Multi-Layer Multi-Head Attention for Neural Machine Translation. Prague Bull. Math. Linguistics, 115:51-82, 2020.

@article{AmpomahMLH20,
  title = {Every Layer Counts: Multi-Layer Multi-Head Attention for Neural Machine Translation},
  author = {Isaac K. E. Ampomah and Sally I. McClean and Zhiwei Lin 0002 and Glenn I. Hawe},
  year = {2020},
  url = {http://ufal.mff.cuni.cz/pbml/115/art-ampomah-et-al.pdf},
  researchr = {https://researchr.org/publication/AmpomahMLH20},
  journal = {Prague Bull. Math. Linguistics},
  volume = {115},
  pages = {51--82},
}