The following publications are possibly variants of this publication:
- Elastic Deep Learning Using Knowledge Distillation with Heterogeneous Computing Resources. Daxiang Dong, Ji Liu 0003, Xi Wang, Weibao Gong, An Qin 0001, Xingjian Li 0002, Dianhai Yu, Patrick Valduriez, Dejing Dou. Euro-Par 2022: 116-128 [doi]
- Online knowledge distillation with elastic peer. Chao Tan, Jie Li 0002. Information Sciences, 583:1-13, 2022. [doi]
- On the correctness of a lock-free compression-based elastic mechanism for a hash trie design. Miguel Areias 0001, Ricardo Rocha 0001. Computing, 104(10):2279-2305, 2022. [doi]