BERTRAM: Improved Word Embeddings Have Big Impact on Contextualized Model Performance

Timo Schick, Hinrich Schütze. BERTRAM: Improved Word Embeddings Have Big Impact on Contextualized Model Performance. In Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel R. Tetreault, editors, Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, ACL 2020, Online, July 5-10, 2020, pages 3996–4007. Association for Computational Linguistics, 2020.

@inproceedings{SchickS20-0,
  title = {BERTRAM: Improved Word Embeddings Have Big Impact on Contextualized Model Performance},
  author = {Timo Schick and Hinrich Schütze},
  year = {2020},
  url = {https://www.aclweb.org/anthology/2020.acl-main.368/},
  researchr = {https://researchr.org/publication/SchickS20-0},
  pages = {3996--4007},
  booktitle = {Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, ACL 2020, Online, July 5-10, 2020},
  editor = {Dan Jurafsky and Joyce Chai and Natalie Schluter and Joel R. Tetreault},
  publisher = {Association for Computational Linguistics},
  isbn = {978-1-952148-25-5},
}