Fine-Tuning Pre-Trained Language Models Effectively by Optimizing Subnetworks Adaptively

Haojie Zhang, Ge Li 0001, Jia Li, Zhongjin Zhang, Yuqi Zhu, Zhi Jin. Fine-Tuning Pre-Trained Language Models Effectively by Optimizing Subnetworks Adaptively. In Sanmi Koyejo, Shakir Mohamed, Alekh Agarwal, Danielle Belgrave, Kyunghyun Cho, Alice Oh, editors, Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, NeurIPS 2022, New Orleans, LA, USA, November 28 - December 9, 2022. 2022.

@inproceedings{Zhang0LZZJ22,
  title = {Fine-Tuning Pre-Trained Language Models Effectively by Optimizing Subnetworks Adaptively},
  author = {Haojie Zhang and Ge Li 0001 and Jia Li and Zhongjin Zhang and Yuqi Zhu and Zhi Jin},
  year = {2022},
  url = {http://papers.nips.cc/paper_files/paper/2022/hash/869bfd807a513755bef25e3896a19a21-Abstract-Conference.html},
  researchr = {https://researchr.org/publication/Zhang0LZZJ22},
  booktitle = {Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, NeurIPS 2022, New Orleans, LA, USA, November 28 - December 9, 2022},
  editor = {Sanmi Koyejo and Shakir Mohamed and Alekh Agarwal and Danielle Belgrave and Kyunghyun Cho and Alice Oh},
  isbn = {9781713871088},
}