The following publications are possibly variants of this publication:
- Improving Transformer-Based Speech Recognition with Unsupervised Pre-Training and Multi-Task Semantic Knowledge Learning. Song Li, Lin Li, Qingyang Hong, Lingling Liu. interspeech 2020: 5006-5010 [doi]
- Hierarchical Transformer: Unsupervised Representation Learning for Skeleton-Based Human Action Recognition. Yi-Bin Cheng, Xipeng Chen, Junhong Chen, Pengxu Wei, Dongyu Zhang, Liang Lin. icmcs 2021: 1-6 [doi]
- RAPT: Pre-training of Time-Aware Transformer for Learning Robust Healthcare Representation. Houxing Ren, Jingyuan Wang, Wayne Xin Zhao, Ning Wu. kdd 2021: 3503-3511 [doi]
- UGTransformer: Unsupervised Graph Transformer Representation Learning. Lixiang Xu, Haifeng Liu, Qingzhe Cui, Bin Luo, Ning Li, Yan Chen, Yuanyan Tang. ijcnn 2023: 1-8 [doi]
- Log-based Anomaly Detection Without Log Parsing. Van-Hoang Le, Hongyu Zhang. 2021
- Unsupervised Extractive Summarization by Pre-training Hierarchical Transformers. Shusheng Xu, Xingxing Zhang, Yi Wu, Furu Wei, Ming Zhou. emnlp 2020: 1784-1795 [doi]