Lipschitz normalization for self-attention layers with application to graph neural networks

George Dasoulas, Kevin Scaman, Aladin Virmaux. Lipschitz normalization for self-attention layers with application to graph neural networks. In Marina Meila, Tong Zhang, editors, Proceedings of the 38th International Conference on Machine Learning, ICML 2021, 18-24 July 2021, Virtual Event, volume 139 of Proceedings of Machine Learning Research, pages 2456-2466. PMLR, 2021.

Authors

George Dasoulas

Kevin Scaman

Aladin Virmaux