Transformer-XL: Attentive Language Models beyond a Fixed-Length Context

Zihang Dai, Zhilin Yang, Yiming Yang, Jaime G. Carbonell, Quoc Viet Le, Ruslan Salakhutdinov. Transformer-XL: Attentive Language Models beyond a Fixed-Length Context. In Anna Korhonen, David R. Traum, Lluís Màrquez, editors, Proceedings of the 57th Conference of the Association for Computational Linguistics, ACL 2019, Florence, Italy, July 28 - August 2, 2019, Volume 1: Long Papers. pages 2978-2988, Association for Computational Linguistics, 2019. doi: 10.18653/v1/P19-1285

@inproceedings{DaiYYCLS19,
  title = {Transformer-XL: Attentive Language Models beyond a Fixed-Length Context},
  author = {Zihang Dai and Zhilin Yang and Yiming Yang and Jaime G. Carbonell and Quoc Viet Le and Ruslan Salakhutdinov},
  year = {2019},
  url = {https://www.aclweb.org/anthology/P19-1285/},
  doi = {10.18653/v1/P19-1285},
  researchr = {https://researchr.org/publication/DaiYYCLS19},
  pages = {2978-2988},
  booktitle = {Proceedings of the 57th Conference of the Association for Computational Linguistics, ACL 2019, Florence, Italy, July 28 - August 2, 2019, Volume 1: Long Papers},
  editor = {Anna Korhonen and David R. Traum and Lluís Màrquez},
  publisher = {Association for Computational Linguistics},
  isbn = {978-1-950737-48-2},
}