XNOR Neural Engine: A Hardware Accelerator IP for 21.6-fJ/op Binary Neural Network Inference

Francesco Conti, Pasquale Davide Schiavone, Luca Benini. XNOR Neural Engine: A Hardware Accelerator IP for 21.6-fJ/op Binary Neural Network Inference. IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, 37(11):2940-2951, 2018.
