PGSG at SemEval-2020 Task 12: BERT-LSTM with Tweets' Pretrained Model and Noisy Student Training Method

Bao-Tran Pham-Hong, Setu Chokshi. PGSG at SemEval-2020 Task 12: BERT-LSTM with Tweets' Pretrained Model and Noisy Student Training Method. In Aurélie Herbelot, Xiaodan Zhu, Alexis Palmer, Nathan Schneider, Jonathan May, Ekaterina Shutova, editors, Proceedings of the Fourteenth Workshop on Semantic Evaluation, SemEval@COLING 2020, Barcelona (online), December 12-13, 2020. pages 2111-2116, International Committee for Computational Linguistics, 2020.

@inproceedings{Pham-HongC20,
  title = {PGSG at SemEval-2020 Task 12: BERT-LSTM with Tweets' Pretrained Model and Noisy Student Training Method},
  author = {Bao-Tran Pham-Hong and Setu Chokshi},
  year = {2020},
  url = {https://www.aclweb.org/anthology/2020.semeval-1.280/},
  researchr = {https://researchr.org/publication/Pham-HongC20},
  pages = {2111-2116},
  booktitle = {Proceedings of the Fourteenth Workshop on Semantic Evaluation, SemEval@COLING 2020, Barcelona (online), December 12-13, 2020},
  editor = {Aurélie Herbelot and Xiaodan Zhu and Alexis Palmer and Nathan Schneider and Jonathan May and Ekaterina Shutova},
  publisher = {International Committee for Computational Linguistics},
  isbn = {978-1-952148-31-6},
}