H-Transformer-1D: Fast One-Dimensional Hierarchical Attention for Sequences

Zhenhai Zhu, Radu Soricut. H-Transformer-1D: Fast One-Dimensional Hierarchical Attention for Sequences. In Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli, editors, Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, ACL/IJCNLP 2021 (Volume 1: Long Papers), Virtual Event, August 1-6, 2021, pages 3801-3815, Association for Computational Linguistics, 2021.

@inproceedings{ZhuS20-12,
  title = {H-Transformer-1D: Fast One-Dimensional Hierarchical Attention for Sequences},
  author = {Zhenhai Zhu and Radu Soricut},
  year = {2021},
  url = {https://aclanthology.org/2021.acl-long.294},
  doi = {10.18653/v1/2021.acl-long.294},
  researchr = {https://researchr.org/publication/ZhuS20-12},
  pages = {3801-3815},
  booktitle = {Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, ACL/IJCNLP 2021 (Volume 1: Long Papers), Virtual Event, August 1-6, 2021},
  editor = {Chengqing Zong and Fei Xia and Wenjie Li and Roberto Navigli},
  publisher = {Association for Computational Linguistics},
  isbn = {978-1-954085-52-7},
}