PreQuant: A Task-agnostic Quantization Approach for Pre-trained Language Models

Zhuocheng Gong, Jiahao Liu, Qifan Wang, Yang Yang, Jingang Wang, Wei Wu, Yunsen Xian, Dongyan Zhao, Rui Yan. PreQuant: A Task-agnostic Quantization Approach for Pre-trained Language Models. In Anna Rogers, Jordan L. Boyd-Graber, Naoaki Okazaki, editors, Findings of the Association for Computational Linguistics: ACL 2023, Toronto, Canada, July 9-14, 2023. pages 8065-8079, Association for Computational Linguistics, 2023.

@inproceedings{GongLWYW0X0023,
  title = {PreQuant: A Task-agnostic Quantization Approach for Pre-trained Language Models},
  author = {Zhuocheng Gong and Jiahao Liu and Qifan Wang and Yang Yang and Jingang Wang and Wei Wu and Yunsen Xian and Dongyan Zhao and Rui Yan},
  year = {2023},
  url = {https://aclanthology.org/2023.findings-acl.511},
  researchr = {https://researchr.org/publication/GongLWYW0X0023},
  pages = {8065-8079},
  booktitle = {Findings of the Association for Computational Linguistics: ACL 2023, Toronto, Canada, July 9-14, 2023},
  editor = {Anna Rogers and Jordan L. Boyd-Graber and Naoaki Okazaki},
  publisher = {Association for Computational Linguistics},
  isbn = {978-1-959429-62-3},
}