Too Much in Common: Shifting of Embeddings in Transformer Language Models and its Implications

Daniel Bis, Maksim Podkorytov, Xiuwen Liu. Too Much in Common: Shifting of Embeddings in Transformer Language Models and its Implications. In Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tür, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou, editors, Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL-HLT 2021, Online, June 6-11, 2021, pages 5117-5130. Association for Computational Linguistics, 2021.

Abstract

Abstract is missing.