Adaptive Knowledge Distillation Between Text and Speech Pre-Trained Models

Jinjie Ni, Yukun Ma, Wen Wang, Qian Chen, Dianwen Ng, Han Lei, Trung Hieu Nguyen, Chong Zhang, Bin Ma, Erik Cambria. Adaptive Knowledge Distillation Between Text and Speech Pre-Trained Models. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2023), Rhodes Island, Greece, June 4-10, 2023, pages 1-5. IEEE, 2023.

Abstract

(Abstract not available for this entry.)