Transformer Working Memory Enables Regular Language Reasoning And Natural Language Length Extrapolation

Ta-Chung Chi, Ting-Han Fan, Alexander Rudnicky, Peter J. Ramadge. Transformer Working Memory Enables Regular Language Reasoning And Natural Language Length Extrapolation. In Houda Bouamor, Juan Pino, Kalika Bali, editors, Findings of the Association for Computational Linguistics: EMNLP 2023, Singapore, December 6-10, 2023, pages 5972-5984. Association for Computational Linguistics, 2023.

