Grgur Kovac, Jérémy Perez, Rémy Portelas, Peter Ford Dominey, Pierre-Yves Oudeyer. Recursive Training Loops in LLMs: How training data properties modulate distribution shift in generated data? In Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng, editors, Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, EMNLP 2025, Suzhou, China, November 4-9, 2025, pages 32290-32309. Association for Computational Linguistics, 2025.