Unsupervised Boundary-Aware Language Model Pretraining for Chinese Sequence Labeling

Peijie Jiang, Dingkun Long, Yanzhao Zhang, Pengjun Xie, Meishan Zhang, Min Zhang. Unsupervised Boundary-Aware Language Model Pretraining for Chinese Sequence Labeling. In Yoav Goldberg, Zornitsa Kozareva, Yue Zhang, editors, Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022, Abu Dhabi, United Arab Emirates, December 7-11, 2022, pages 526-537, Association for Computational Linguistics, 2022.

@inproceedings{JiangLZXZ022,
  title = {Unsupervised Boundary-Aware Language Model Pretraining for Chinese Sequence Labeling},
  author = {Peijie Jiang and Dingkun Long and Yanzhao Zhang and Pengjun Xie and Meishan Zhang and Min Zhang},
  year = {2022},
  url = {https://aclanthology.org/2022.emnlp-main.34},
  researchr = {https://researchr.org/publication/JiangLZXZ022},
  pages = {526-537},
  booktitle = {Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022, Abu Dhabi, United Arab Emirates, December 7-11, 2022},
  editor = {Yoav Goldberg and Zornitsa Kozareva and Yue Zhang},
  publisher = {Association for Computational Linguistics},
}