Synthesizer: Rethinking Self-Attention for Transformer Models

Yi Tay, Dara Bahri, Donald Metzler, Da-Cheng Juan, Zhe Zhao, Che Zheng. Synthesizer: Rethinking Self-Attention for Transformer Models. In Marina Meila, Tong Zhang, editors, Proceedings of the 38th International Conference on Machine Learning, ICML 2021, 18-24 July 2021, Virtual Event. Volume 139 of Proceedings of Machine Learning Research, pages 10183-10192, PMLR, 2021.
