Block Transformer: Global-to-Local Language Modeling for Fast Inference

Namgyu Ho, Sangmin Bae, Taehyeon Kim, Hyunjik Jo, Yireun Kim, Tal Schuster, Adam Fisch, James Thorne, Se-Young Yun. Block Transformer: Global-to-Local Language Modeling for Fast Inference. In Amir Globerson, Lester Mackey, Danielle Belgrave, Angela Fan, Ulrich Paquet, Jakub M. Tomczak, Cheng Zhang, editors, Advances in Neural Information Processing Systems 38: Annual Conference on Neural Information Processing Systems 2024, NeurIPS 2024, Vancouver, BC, Canada, December 10-15, 2024.

Abstract

No abstract is available for this entry.