Hidden State Variability of Pretrained Language Models Can Guide Computation Reduction for Transfer Learning

Shuo Xie, Jiahao Qiu, Ankita Pasad, Li Du, Qing Qu, Hongyuan Mei. Hidden State Variability of Pretrained Language Models Can Guide Computation Reduction for Transfer Learning. In Yoav Goldberg, Zornitsa Kozareva, Yue Zhang, editors, Findings of the Association for Computational Linguistics: EMNLP 2022, Abu Dhabi, United Arab Emirates, December 7-11, 2022, pages 5750-5768. Association for Computational Linguistics, 2022.

Authors

Shuo Xie

Jiahao Qiu

Ankita Pasad

Li Du

Qing Qu

Hongyuan Mei