Distractor Generation based on Text2Text Language Models with Pseudo Kullback-Leibler Divergence Regulation

Hui-Juan Wang, Kai-Yu Hsieh, Han-Cheng Yu, Jui-Ching Tsou, Yu-An Shih, Chen-Hua Huang, Yao-Chung Fan. Distractor Generation based on Text2Text Language Models with Pseudo Kullback-Leibler Divergence Regulation. In Anna Rogers, Jordan L. Boyd-Graber, Naoaki Okazaki, editors, Findings of the Association for Computational Linguistics: ACL 2023, Toronto, Canada, July 9-14, 2023. Pages 12477-12491, Association for Computational Linguistics, 2023.
