Don't Stop Pretraining: Adapt Language Models to Domains and Tasks

Suchin Gururangan, Ana Marasović, Swabha Swayamdipta, Kyle Lo, Iz Beltagy, Doug Downey, Noah A. Smith. Don't Stop Pretraining: Adapt Language Models to Domains and Tasks. In Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel R. Tetreault, editors, Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, ACL 2020, Online, July 5-10, 2020. pages 8342-8360, Association for Computational Linguistics, 2020. doi: 10.18653/v1/2020.acl-main.740

@inproceedings{GururanganMSLBD20,
  title = {Don't Stop Pretraining: Adapt Language Models to Domains and Tasks},
  author = {Suchin Gururangan and Ana Marasović and Swabha Swayamdipta and Kyle Lo and Iz Beltagy and Doug Downey and Noah A. Smith},
  year = {2020},
  url = {https://www.aclweb.org/anthology/2020.acl-main.740/},
  doi = {10.18653/v1/2020.acl-main.740},
  researchr = {https://researchr.org/publication/GururanganMSLBD20},
  pages = {8342--8360},
  booktitle = {Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, ACL 2020, Online, July 5-10, 2020},
  editor = {Dan Jurafsky and Joyce Chai and Natalie Schluter and Joel R. Tetreault},
  publisher = {Association for Computational Linguistics},
  isbn = {978-1-952148-25-5},
}