Synthesizer: Rethinking Self-Attention for Transformer Models

Yi Tay, Dara Bahri, Donald Metzler, Da-Cheng Juan, Zhe Zhao, Che Zheng. Synthesizer: Rethinking Self-Attention for Transformer Models. In Marina Meila, Tong Zhang, editors, Proceedings of the 38th International Conference on Machine Learning, ICML 2021, 18-24 July 2021, Virtual Event. Volume 139 of Proceedings of Machine Learning Research, pages 10183-10192, PMLR, 2021.

@inproceedings{TayBMJZZ21,
  title = {Synthesizer: Rethinking Self-Attention for Transformer Models},
  author = {Yi Tay and Dara Bahri and Donald Metzler and Da-Cheng Juan and Zhe Zhao and Che Zheng},
  year = {2021},
  url = {http://proceedings.mlr.press/v139/tay21a.html},
  researchr = {https://researchr.org/publication/TayBMJZZ21},
  pages = {10183--10192},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning, ICML 2021, 18-24 July 2021, Virtual Event},
  editor = {Marina Meila and Tong Zhang},
  volume = {139},
  series = {Proceedings of Machine Learning Research},
  publisher = {PMLR},
}