The following publications are possibly variants of this publication:
- A High Energy-Efficiency Inference Accelerator Exploiting Sparse CNNs. Ning Li. iccai 2020: 534-541 [doi]
- AdaS: A Fast and Energy-Efficient CNN Accelerator Exploiting Bit-Sparsity. Xiaolong Lin, Gang Li 0015, Zizhao Liu, Yadong Liu, Fan Zhang, Zhuoran Song, Naifeng Jing, Xiaoyao Liang. dac 2023: 1-6 [doi]
- SqueezeFlow: A Sparse CNN Accelerator Exploiting Concise Convolution Rules. Jiajun Li, Shuhao Jiang, Shijun Gong, Jingya Wu, Junchao Yan, Guihai Yan, Xiaowei Li 0001. TC, 68(11):1663-1677, 2019. [doi]
- High PE Utilization CNN Accelerator with Channel Fusion Supporting Pattern-Compressed Sparse Neural Networks. Jingyu Wang, Songming Yu, Jinshan Yue, Zhe Yuan, Zhuqing Yuan, Huazhong Yang, Xueqing Li, Yongpan Liu. dac 2020: 1-6 [doi]
- Split DNN Inference for Exploiting Near-Edge Accelerators. Hao Liu, Mohammed E. Fouda, Ahmed M. Eltawil, Suhaib A. Fahmy. edge 2024: 84-91 [doi]
- A None-Sparse Inference Accelerator that Distills and Reuses the Computation Redundancy in CNNs. Ying Wang, Shengwen Liang, Huawei Li, Xiaowei Li. dac 2019: 202 [doi]