Pretraining with Contrastive Sentence Objectives Improves Discourse Performance of Language Models

Dan Iter, Kelvin Guu, Larry Lansing, Dan Jurafsky. Pretraining with Contrastive Sentence Objectives Improves Discourse Performance of Language Models. In Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel R. Tetreault, editors, Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (ACL 2020), Online, July 5-10, 2020, pages 4859-4870. Association for Computational Linguistics, 2020.

Authors

Dan Iter

Kelvin Guu

Larry Lansing

Dan Jurafsky
