Self-training Improves Pre-training for Natural Language Understanding

Jingfei Du, Edouard Grave, Beliz Gunel, Vishrav Chaudhary, Onur Celebi, Michael Auli, Veselin Stoyanov, Alexis Conneau. Self-training Improves Pre-training for Natural Language Understanding. In Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tür, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou, editors, Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL-HLT 2021, Online, June 6-11, 2021, pages 5408-5418. Association for Computational Linguistics, 2021.
