Deep compression and EIE: Efficient inference engine on compressed deep neural network

Song Han, Xingyu Liu, Huizi Mao, Jing Pu, Ardavan Pedram, Mark Horowitz, and Bill Dally. Deep compression and EIE: Efficient inference engine on compressed deep neural network. In 2016 IEEE Hot Chips 28 Symposium (HCS), Cupertino, CA, USA, August 21-23, 2016, pages 1-6. IEEE, 2016.
