Plausibility Processing in Transformer Language Models: Focusing on the Role of Attention Heads in GPT

Soo Ryu. Plausibility Processing in Transformer Language Models: Focusing on the Role of Attention Heads in GPT. In Houda Bouamor, Juan Pino, Kalika Bali, editors, Findings of the Association for Computational Linguistics: EMNLP 2023, Singapore, December 6-10, 2023. pages 356-369, Association for Computational Linguistics, 2023.

@inproceedings{Ryu23-1,
  title = {Plausibility Processing in Transformer Language Models: Focusing on the Role of Attention Heads in GPT},
  author = {Soo Ryu},
  year = {2023},
  url = {https://aclanthology.org/2023.findings-emnlp.27},
  researchr = {https://researchr.org/publication/Ryu23-1},
  pages = {356--369},
  booktitle = {Findings of the Association for Computational Linguistics: EMNLP 2023, Singapore, December 6-10, 2023},
  editor = {Houda Bouamor and Juan Pino and Kalika Bali},
  publisher = {Association for Computational Linguistics},
  isbn = {979-8-89176-061-5},
}