Hidden State Variability of Pretrained Language Models Can Guide Computation Reduction for Transfer Learning

Shuo Xie, Jiahao Qiu, Ankita Pasad, Li Du, Qing Qu, Hongyuan Mei. Hidden State Variability of Pretrained Language Models Can Guide Computation Reduction for Transfer Learning. In Yoav Goldberg, Zornitsa Kozareva, Yue Zhang, editors, Findings of the Association for Computational Linguistics: EMNLP 2022, Abu Dhabi, United Arab Emirates, December 7-11, 2022, pages 5750-5768. Association for Computational Linguistics, 2022.

@inproceedings{XieQPDQM22,
  title = {Hidden State Variability of Pretrained Language Models Can Guide Computation Reduction for Transfer Learning},
  author = {Shuo Xie and Jiahao Qiu and Ankita Pasad and Li Du and Qing Qu and Hongyuan Mei},
  year = {2022},
  url = {https://aclanthology.org/2022.findings-emnlp.422},
  researchr = {https://researchr.org/publication/XieQPDQM22},
  pages = {5750--5768},
  booktitle = {Findings of the Association for Computational Linguistics: EMNLP 2022, Abu Dhabi, United Arab Emirates, December 7-11, 2022},
  editor = {Yoav Goldberg and Zornitsa Kozareva and Yue Zhang},
  publisher = {Association for Computational Linguistics},
}