Sentence-T5: Scalable Sentence Encoders from Pre-trained Text-to-Text Models

Jianmo Ni, Gustavo Hernandez Ábrego, Noah Constant, Ji Ma, Keith B. Hall, Daniel Cer, Yinfei Yang. Sentence-T5: Scalable Sentence Encoders from Pre-trained Text-to-Text Models. In Smaranda Muresan, Preslav Nakov, Aline Villavicencio, editors, Findings of the Association for Computational Linguistics: ACL 2022, Dublin, Ireland, May 22-27, 2022. pages 1864-1874, Association for Computational Linguistics, 2022.

@inproceedings{NiACMHCY22,
  title = {Sentence-T5: Scalable Sentence Encoders from Pre-trained Text-to-Text Models},
  author = {Jianmo Ni and Gustavo Hernandez Ábrego and Noah Constant and Ji Ma and Keith B. Hall and Daniel Cer and Yinfei Yang},
  year = {2022},
  url = {https://aclanthology.org/2022.findings-acl.146},
  researchr = {https://researchr.org/publication/NiACMHCY22},
  pages = {1864--1874},
  booktitle = {Findings of the Association for Computational Linguistics: ACL 2022, Dublin, Ireland, May 22-27, 2022},
  editor = {Smaranda Muresan and Preslav Nakov and Aline Villavicencio},
  publisher = {Association for Computational Linguistics},
  isbn = {978-1-955917-25-4},
}