Hidden State Variability of Pretrained Language Models Can Guide Computation Reduction for Transfer Learning

Shuo Xie, Jiahao Qiu, Ankita Pasad, Li Du, Qing Qu, Hongyuan Mei. Hidden State Variability of Pretrained Language Models Can Guide Computation Reduction for Transfer Learning. In Yoav Goldberg, Zornitsa Kozareva, Yue Zhang, editors, Findings of the Association for Computational Linguistics: EMNLP 2022, Abu Dhabi, United Arab Emirates, December 7-11, 2022. pages 5750-5768, Association for Computational Linguistics, 2022.
