Towards Anytime Fine-tuning: Continually Pre-trained Language Models with Hypernetwork Prompts

Gangwei Jiang, Caigao Jiang, Siqiao Xue, James Zhang, Jun Zhou, Defu Lian, Ying Wei 0001. Towards Anytime Fine-tuning: Continually Pre-trained Language Models with Hypernetwork Prompts. In Houda Bouamor, Juan Pino 0001, Kalika Bali, editors, Findings of the Association for Computational Linguistics: EMNLP 2023, Singapore, December 6-10, 2023. pages 12081-12095, Association for Computational Linguistics, 2023.

@inproceedings{JiangJXZZL023,
  title = {Towards Anytime Fine-tuning: Continually Pre-trained Language Models with Hypernetwork Prompts},
  author = {Gangwei Jiang and Caigao Jiang and Siqiao Xue and James Zhang and Jun Zhou and Defu Lian and Ying Wei 0001},
  year = {2023},
  url = {https://aclanthology.org/2023.findings-emnlp.808},
  researchr = {https://researchr.org/publication/JiangJXZZL023},
  pages = {12081--12095},
  booktitle = {Findings of the Association for Computational Linguistics: EMNLP 2023, Singapore, December 6-10, 2023},
  editor = {Houda Bouamor and Juan Pino 0001 and Kalika Bali},
  publisher = {Association for Computational Linguistics},
  isbn = {979-8-89176-061-5},
}