Blockwise Self-Attention for Long Document Understanding

Jiezhong Qiu, Hao Ma, Omer Levy, Wen-tau Yih, Sinong Wang, Jie Tang. Blockwise Self-Attention for Long Document Understanding. In Trevor Cohn, Yulan He, Yang Liu, editors, Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: Findings, EMNLP 2020, Online Event, 16-20 November 2020, pages 2555-2565. Association for Computational Linguistics, 2020.

Abstract

Abstract is missing.