XNOR Neural Engine: A Hardware Accelerator IP for 21.6-fJ/op Binary Neural Network Inference

Francesco Conti, Pasquale Davide Schiavone, Luca Benini. XNOR Neural Engine: A Hardware Accelerator IP for 21.6-fJ/op Binary Neural Network Inference. IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, 37(11):2940-2951, 2018. doi: 10.1109/TCAD.2018.2857019

@article{ContiSB18,
  title = {XNOR Neural Engine: A Hardware Accelerator IP for 21.6-fJ/op Binary Neural Network Inference},
  author = {Francesco Conti 0001 and Pasquale Davide Schiavone and Luca Benini},
  year = {2018},
  doi = {10.1109/TCAD.2018.2857019},
  url = {https://doi.org/10.1109/TCAD.2018.2857019},
  researchr = {https://researchr.org/publication/ContiSB18},
  journal = {IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems},
  volume = {37},
  number = {11},
  pages = {2940-2951},
}