PreQuant: A Task-agnostic Quantization Approach for Pre-trained Language Models

Zhuocheng Gong, Jiahao Liu, Qifan Wang, Yang Yang, Jingang Wang, Wei Wu, Yunsen Xian, Dongyan Zhao, Rui Yan. PreQuant: A Task-agnostic Quantization Approach for Pre-trained Language Models. In Anna Rogers, Jordan L. Boyd-Graber, Naoaki Okazaki, editors, Findings of the Association for Computational Linguistics: ACL 2023, Toronto, Canada, July 9-14, 2023. pages 8065-8079, Association for Computational Linguistics, 2023.
