Self-Distillation into Self-Attention Heads for Improving Transformer-based End-to-End Neural Speaker Diarization

Ye-Rin Jeoung, Jeong Hwan Choi, Ju-Seok Seong, Jehyun Kyung, Joon-Hyuk Chang. Self-Distillation into Self-Attention Heads for Improving Transformer-based End-to-End Neural Speaker Diarization. In Naomi Harte, Julie Carson-Berndsen, Gareth Jones, editors, 24th Annual Conference of the International Speech Communication Association, Interspeech 2023, Dublin, Ireland, August 20-24, 2023. pages 3197-3201, ISCA, 2023.
