Fine-Tuning Pre-Trained Language Models Effectively by Optimizing Subnetworks Adaptively

Haojie Zhang, Ge Li, Jia Li, Zhongjin Zhang, Yuqi Zhu, Zhi Jin. Fine-Tuning Pre-Trained Language Models Effectively by Optimizing Subnetworks Adaptively. In Sanmi Koyejo, Shakir Mohamed, Alekh Agarwal, Danielle Belgrave, Kyunghyun Cho, Alice Oh, editors, Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, NeurIPS 2022, New Orleans, LA, USA, November 28 - December 9, 2022. 2022.
