Michael Hassid, Hao Peng, Daniel Rotem, Jungo Kasai, Ivan Montero, Noah A. Smith, Roy Schwartz. How Much Does Attention Actually Attend? Questioning the Importance of Attention in Pretrained Transformers. In Yoav Goldberg, Zornitsa Kozareva, and Yue Zhang, editors, Findings of the Association for Computational Linguistics: EMNLP 2022, Abu Dhabi, United Arab Emirates, December 7-11, 2022, pages 1403-1416. Association for Computational Linguistics, 2022.