Scaling Up Dataset Distillation to ImageNet-1K with Constant Memory

Justin Cui, Ruochen Wang, Si Si, Cho-Jui Hsieh. Scaling Up Dataset Distillation to ImageNet-1K with Constant Memory. In Andreas Krause, Emma Brunskill, Kyunghyun Cho, Barbara Engelhardt, Sivan Sabato, Jonathan Scarlett, editors, International Conference on Machine Learning, ICML 2023, 23-29 July 2023, Honolulu, Hawaii, USA. Volume 202 of Proceedings of Machine Learning Research, pages 6565-6590, PMLR, 2023.
