Transformer Working Memory Enables Regular Language Reasoning and Natural Language Length Extrapolation

Ta-Chung Chi, Ting-Han Fan, Alexander Rudnicky, Peter J. Ramadge. Transformer Working Memory Enables Regular Language Reasoning and Natural Language Length Extrapolation. In Houda Bouamor, Juan Pino, Kalika Bali, editors, Findings of the Association for Computational Linguistics: EMNLP 2023, Singapore, December 6-10, 2023. pages 5972-5984, Association for Computational Linguistics, 2023.

Authors

Ta-Chung Chi
Ting-Han Fan
Alexander Rudnicky
Peter J. Ramadge