Unveiling Internal Reasoning Modes in LLMs: A Deep Dive into Latent Reasoning vs. Factual Shortcuts with Attribute Rate Ratio

Yiran Yang, Haifeng Sun, Jingyu Wang, Qi Qi, Zirui Zhuang, Huazheng Wang, Pengfei Ren, Jing Wang, Jianxin Liao. Unveiling Internal Reasoning Modes in LLMs: A Deep Dive into Latent Reasoning vs. Factual Shortcuts with Attribute Rate Ratio. In Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng, editors, Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing (EMNLP 2025), Suzhou, China, November 4-9, 2025, pages 2186-2206. Association for Computational Linguistics, 2025.
