LSTMs Can Learn Syntax-Sensitive Dependencies Well, But Modeling Structure Makes Them Better

Stephen Clark, Chris Dyer, Phil Blunsom, Dani Yogatama, Adhiguna Kuncoro, John Hale. LSTMs Can Learn Syntax-Sensitive Dependencies Well, But Modeling Structure Makes Them Better. In Iryna Gurevych, Yusuke Miyao, editors, Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, ACL 2018, Melbourne, Australia, July 15-20, 2018, Volume 1: Long Papers. pages 1426-1436, Association for Computational Linguistics, 2018.

@inproceedings{ClarkDBYKH18,
  title = {LSTMs Can Learn Syntax-Sensitive Dependencies Well, But Modeling Structure Makes Them Better},
  author = {Stephen Clark and Chris Dyer and Phil Blunsom and Dani Yogatama and Adhiguna Kuncoro and John Hale},
  year = {2018},
  url = {https://aclanthology.info/papers/P18-1132/p18-1132},
  researchr = {https://researchr.org/publication/ClarkDBYKH18},
  pages = {1426--1436},
  booktitle = {Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, ACL 2018, Melbourne, Australia, July 15-20, 2018, Volume 1: Long Papers},
  editor = {Iryna Gurevych and Yusuke Miyao},
  publisher = {Association for Computational Linguistics},
  isbn = {978-1-948087-32-2},
}