Multiformer: A Head-Configurable Transformer-Based Model for Direct Speech Translation

Gerard Sant, Gerard I. Gállego, Belén Alastruey, Marta Ruiz Costa-Jussà. Multiformer: A Head-Configurable Transformer-Based Model for Direct Speech Translation. In Daphne Ippolito, Liunian Harold Li, Maria Leonor Pacheco, Danqi Chen, Nianwen Xue, editors, Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Student Research Workshop, NAACL-HLT 2022, Hybrid Event / Seattle, WA, USA, July 10-15, 2022. pages 277-284, Association for Computational Linguistics, 2022.

@inproceedings{SantGAC22,
  title = {Multiformer: A Head-Configurable Transformer-Based Model for Direct Speech Translation},
  author = {Gerard Sant and Gerard I. Gállego and Belén Alastruey and Marta Ruiz Costa-Jussà},
  year = {2022},
  url = {https://aclanthology.org/2022.naacl-srw.34},
  researchr = {https://researchr.org/publication/SantGAC22},
  cites = {0},
  citedby = {0},
  pages = {277-284},
  booktitle = {Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Student Research Workshop, NAACL-HLT 2022, Hybrid Event / Seattle, WA, USA, July 10-15, 2022},
  editor = {Daphne Ippolito and Liunian Harold Li and Maria Leonor Pacheco and Danqi Chen and Nianwen Xue},
  publisher = {Association for Computational Linguistics},
  isbn = {978-1-955917-73-5},
}