PLOG: Table-to-Logic Pretraining for Logical Table-to-Text Generation

Ao Liu, Haoyu Dong 0001, Naoaki Okazaki, Shi Han, Dongmei Zhang. PLOG: Table-to-Logic Pretraining for Logical Table-to-Text Generation. In Yoav Goldberg, Zornitsa Kozareva, Yue Zhang, editors, Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022, Abu Dhabi, United Arab Emirates, December 7-11, 2022, pages 5531-5546. Association for Computational Linguistics, 2022.

@inproceedings{Liu0OHZ22,
  title = {PLOG: Table-to-Logic Pretraining for Logical Table-to-Text Generation},
  author = {Ao Liu and Haoyu Dong 0001 and Naoaki Okazaki and Shi Han and Dongmei Zhang},
  year = {2022},
  url = {https://aclanthology.org/2022.emnlp-main.373},
  researchr = {https://researchr.org/publication/Liu0OHZ22},
  pages = {5531-5546},
  booktitle = {Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022, Abu Dhabi, United Arab Emirates, December 7-11, 2022},
  editor = {Yoav Goldberg and Zornitsa Kozareva and Yue Zhang},
  publisher = {Association for Computational Linguistics},
}