Transformer Working Memory Enables Regular Language Reasoning And Natural Language Length Extrapolation

Ta-Chung Chi, Ting-Han Fan, Alexander Rudnicky, Peter J. Ramadge. Transformer Working Memory Enables Regular Language Reasoning And Natural Language Length Extrapolation. In Houda Bouamor, Juan Pino, Kalika Bali, editors, Findings of the Association for Computational Linguistics: EMNLP 2023, Singapore, December 6-10, 2023. pages 5972-5984, Association for Computational Linguistics, 2023.

@inproceedings{ChiFRR23-0,
  title = {Transformer Working Memory Enables Regular Language Reasoning And Natural Language Length Extrapolation},
  author = {Ta-Chung Chi and Ting-Han Fan and Alexander Rudnicky and Peter J. Ramadge},
  year = {2023},
  url = {https://aclanthology.org/2023.findings-emnlp.397},
  researchr = {https://researchr.org/publication/ChiFRR23-0},
  pages = {5972-5984},
  booktitle = {Findings of the Association for Computational Linguistics: EMNLP 2023, Singapore, December 6-10, 2023},
  editor = {Houda Bouamor and Juan Pino and Kalika Bali},
  publisher = {Association for Computational Linguistics},
  isbn = {979-8-89176-061-5},
}