Pretraining with Contrastive Sentence Objectives Improves Discourse Performance of Language Models

Dan Iter, Kelvin Guu, Larry Lansing, Dan Jurafsky. Pretraining with Contrastive Sentence Objectives Improves Discourse Performance of Language Models. In Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel R. Tetreault, editors, Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, ACL 2020, Online, July 5-10, 2020. pages 4859-4870, Association for Computational Linguistics, 2020.

@inproceedings{IterGLJ20,
  title = {Pretraining with Contrastive Sentence Objectives Improves Discourse Performance of Language Models},
  author = {Dan Iter and Kelvin Guu and Larry Lansing and Dan Jurafsky},
  year = {2020},
  url = {https://www.aclweb.org/anthology/2020.acl-main.439/},
  researchr = {https://researchr.org/publication/IterGLJ20},
  pages = {4859--4870},
  booktitle = {Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, ACL 2020, Online, July 5-10, 2020},
  editor = {Dan Jurafsky and Joyce Chai and Natalie Schluter and Joel R. Tetreault},
  publisher = {Association for Computational Linguistics},
  isbn = {978-1-952148-25-5},
}