MATE: Multi-view Attention for Table Transformer Efficiency

Julian Eisenschlos, Maharshi Gor, Thomas Müller, William W. Cohen. MATE: Multi-view Attention for Table Transformer Efficiency. In Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih, editors, Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), Virtual Event / Punta Cana, Dominican Republic, 7-11 November 2021, pages 7606-7619. Association for Computational Linguistics, 2021.
