UoR at SemEval-2021 Task 7: Utilizing Pre-trained DistilBERT Model and Multi-scale CNN for Humor Detection

Zehao Liu, Carl Haines, Huizhi Liang. UoR at SemEval-2021 Task 7: Utilizing Pre-trained DistilBERT Model and Multi-scale CNN for Humor Detection. In Alexis Palmer, Nathan Schneider, Natalie Schluter, Guy Emerson, Aurélie Herbelot, Xiaodan Zhu, editors, Proceedings of the 15th International Workshop on Semantic Evaluation, SemEval@ACL/IJCNLP 2021, Virtual Event / Bangkok, Thailand, August 5-6, 2021. pages 1179-1184, Association for Computational Linguistics, 2021.

@inproceedings{LiuHL21-3,
  title = {UoR at SemEval-2021 Task 7: Utilizing Pre-trained DistilBERT Model and Multi-scale CNN for Humor Detection},
  author = {Zehao Liu and Carl Haines and Huizhi Liang},
  year = {2021},
  url = {https://aclanthology.org/2021.semeval-1.166},
  researchr = {https://researchr.org/publication/LiuHL21-3},
  pages = {1179--1184},
  booktitle = {Proceedings of the 15th International Workshop on Semantic Evaluation, SemEval@ACL/IJCNLP 2021, Virtual Event / Bangkok, Thailand, August 5-6, 2021},
  editor = {Alexis Palmer and Nathan Schneider and Natalie Schluter and Guy Emerson and Aurélie Herbelot and Xiaodan Zhu},
  publisher = {Association for Computational Linguistics},
  isbn = {978-1-954085-70-1},
}