Low-rank and global-representation-key-based attention for graph transformer

Lingping Kong, Varun Ojha, Ruobin Gao, Ponnuthurai Nagaratnam Suganthan, Václav Snášel. Low-rank and global-representation-key-based attention for graph transformer. Inf. Sci., 642:119108, 2023.
