Boost Transformer-based Language Models with GPU-Friendly Sparsity and Quantization

Chong Yu, Tao Chen, Zhongxue Gan. Boost Transformer-based Language Models with GPU-Friendly Sparsity and Quantization. In Anna Rogers, Jordan L. Boyd-Graber, Naoaki Okazaki, editors, Findings of the Association for Computational Linguistics: ACL 2023, Toronto, Canada, July 9-14, 2023, pages 218-235. Association for Computational Linguistics, 2023.
