Structural Knowledge Distillation: Tractably Distilling Information for Structured Predictor

Xinyu Wang, Yong Jiang, Zhaohui Yan, Zixia Jia, Nguyen Bach, Tao Wang, Zhongqiang Huang, Fei Huang, Kewei Tu. Structural Knowledge Distillation: Tractably Distilling Information for Structured Predictor. In Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli, editors, Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, ACL/IJCNLP 2021 (Volume 1: Long Papers), Virtual Event, August 1-6, 2021. pages 550-564, Association for Computational Linguistics, 2021.

@inproceedings{WangJYJBWHHT20,
  title = {Structural Knowledge Distillation: Tractably Distilling Information for Structured Predictor},
  author = {Xinyu Wang and Yong Jiang and Zhaohui Yan and Zixia Jia and Nguyen Bach and Tao Wang and Zhongqiang Huang and Fei Huang and Kewei Tu},
  year = {2021},
  url = {https://aclanthology.org/2021.acl-long.46},
  researchr = {https://researchr.org/publication/WangJYJBWHHT20},
  pages = {550-564},
  booktitle = {Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, ACL/IJCNLP 2021 (Volume 1: Long Papers), Virtual Event, August 1-6, 2021},
  editor = {Chengqing Zong and Fei Xia and Wenjie Li and Roberto Navigli},
  publisher = {Association for Computational Linguistics},
  isbn = {978-1-954085-52-7},
}