TAID: Temporally Adaptive Interpolated Distillation for Efficient Knowledge Transfer in Language Models

Makoto Shing, Kou Misaki, Han Bao, Sho Yokoi, Takuya Akiba. TAID: Temporally Adaptive Interpolated Distillation for Efficient Knowledge Transfer in Language Models. In The Thirteenth International Conference on Learning Representations, ICLR 2025, Singapore, April 24-28, 2025. OpenReview.net, 2025.

@inproceedings{ShingMBYA25,
  title = {TAID: Temporally Adaptive Interpolated Distillation for Efficient Knowledge Transfer in Language Models},
  author = {Makoto Shing and Kou Misaki and Han Bao and Sho Yokoi and Takuya Akiba},
  year = {2025},
  url = {https://openreview.net/forum?id=cqsw28DuMW},
  researchr = {https://researchr.org/publication/ShingMBYA25},
  booktitle = {The Thirteenth International Conference on Learning Representations, ICLR 2025, Singapore, April 24-28, 2025},
  publisher = {OpenReview.net},
}