The following publications are possibly variants of this publication:
- Accelerating Convolutional Neural Network by Exploiting Sparsity on GPUs. Weizhi Xu 0001, Yintai Sun, Shengyu Fan, Hui Yu, Xin Fu. ACM TACO, 20(3), September 2023. [doi]
- Exploiting Dynamic Bit Sparsity in Activation for Deep Neural Network Acceleration. Yongshuai Sun, Mengyu Guo, Dacheng Liang, Shan Tang, Naifeng Jing. ASICON 2021: 1-4. [doi]
- Exploiting bit sparsity in both activation and weight in neural networks accelerators. Naifeng Jing, Zihan Zhang, Yongshuai Sun, Pengyu Liu, Liyan Chen, Qin Wang 0009, Jianfei Jiang 0001. Integration, 88:400-409, 2023. [doi]
- Exploiting the input sparsity to accelerate deep neural networks: poster. Xiao Dong, Lei Liu, Guangli Li, Jiansong Li, Peng Zhao, Xueying Wang, Xiaobing Feng 0002. PPoPP 2019: 401-402. [doi]
- Emergence of Shape Bias in Convolutional Neural Networks through Activation Sparsity. Tianqin Li, Ziqi Wen, Yangfan Li, Tai Sing Lee. NeurIPS 2023. [doi]