XNOR Neural Engine: A Hardware Accelerator IP for 21.6-fJ/op Binary Neural Network Inference

Francesco Conti, Pasquale Davide Schiavone, Luca Benini. XNOR Neural Engine: A Hardware Accelerator IP for 21.6-fJ/op Binary Neural Network Inference. IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, 37(11):2940–2951, 2018.
