Less Is Better: Recovering Intended-Feature Subspace to Robustify NLU Models

Ting Wu, Tao Gui. Less Is Better: Recovering Intended-Feature Subspace to Robustify NLU Models. In Nicoletta Calzolari, Chu-Ren Huang, Hansaem Kim, James Pustejovsky, Leo Wanner, Key-Sun Choi, Pum-Mo Ryu, Hsin-Hsi Chen, Lucia Donatelli, Heng Ji, Sadao Kurohashi, Patrizia Paggio, Nianwen Xue, Seokhwan Kim, YoungGyun Hahm, Zhong He, Tony Kyungil Lee, Enrico Santus, Francis Bond, Seung-Hoon Na, editors, Proceedings of the 29th International Conference on Computational Linguistics, COLING 2022, Gyeongju, Republic of Korea, October 12-17, 2022. pages 1666-1676, International Committee on Computational Linguistics, 2022.

@inproceedings{WuG22-4,
  title = {Less Is Better: Recovering Intended-Feature Subspace to Robustify NLU Models},
  author = {Ting Wu and Tao Gui},
  year = {2022},
  url = {https://aclanthology.org/2022.coling-1.143},
  researchr = {https://researchr.org/publication/WuG22-4},
  cites = {0},
  citedby = {0},
  pages = {1666-1676},
  booktitle = {Proceedings of the 29th International Conference on Computational Linguistics, COLING 2022, Gyeongju, Republic of Korea, October 12-17, 2022},
  editor = {Nicoletta Calzolari and Chu-Ren Huang and Hansaem Kim and James Pustejovsky and Leo Wanner and Key-Sun Choi and Pum-Mo Ryu and Hsin-Hsi Chen and Lucia Donatelli and Heng Ji and Sadao Kurohashi and Patrizia Paggio and Nianwen Xue and Seokhwan Kim and YoungGyun Hahm and Zhong He and Tony Kyungil Lee and Enrico Santus and Francis Bond and Seung-Hoon Na},
  publisher = {International Committee on Computational Linguistics},
}