The following publications are possibly variants of this publication:
- DeepSave: saving DNN inference during handovers on the edge. Weiyu Ju, Dong Yuan, Wei Bao, Liming Ge, Bing Bing Zhou. edge 2019: 166-178 [doi]
- Dynamic Path Based DNN Synergistic Inference Acceleration in Edge Computing Environment. Meng Zhou, Bowen Zhou, Huitian Wang, Fang Dong, Wei Zhao. icpads 2021: 567-574 [doi]
- ADDA: Adaptive Distributed DNN Inference Acceleration in Edge Computing Environment. Huitian Wang, Guangxing Cai, Zhaowu Huang, Fang Dong. icpads 2019: 438-445 [doi]
- An adaptive DNN inference acceleration framework with end-edge-cloud collaborative computing. Guozhi Liu, Fei Dai 0002, Xiaolong Xu 0001, Xiaodong Fu, Wanchun Dou, Neeraj Kumar, Muhammad Bilal 0003. fgcs, 140:422-435, 2023. [doi]
- Inference Acceleration with Adaptive Distributed DNN Partition over Dynamic Video Stream. Jin Cao, Bo Li, Mengni Fan, Huiyu Liu. algorithms, 15(7):244, 2022. [doi]
- DNN Surgery: Accelerating DNN Inference on the Edge Through Layer Partitioning. Huanghuang Liang, Qianlong Sang, Chuang Hu, Dazhao Cheng, Xiaobo Zhou, Dan Wang, Wei Bao, Yu Wang 0003. tcc, 11(3):3111-3125, July - September 2023. [doi]