Contrastive Visual Semantic Pretraining Magnifies the Semantics of Natural Language Representations

Robert Wolfe, Aylin Caliskan. Contrastive Visual Semantic Pretraining Magnifies the Semantics of Natural Language Representations. In Smaranda Muresan, Preslav Nakov, Aline Villavicencio, editors, Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), ACL 2022, Dublin, Ireland, May 22-27, 2022. pages 3050-3061, Association for Computational Linguistics, 2022.
