The following publications are possibly variants of this publication:
- JointContrast: Skeleton-Based Interaction Recognition with New Representation and Contrastive Learning. Ji Zhang, Xiangze Jia, Zhen Wang, Yonglong Luo, Fulong Chen 0002, Gaoming Yang, Lihui Zhao. Algorithms, 16(4):190, April 2023. [doi]
- Graph Contrastive Learning for Skeleton-based Action Recognition. Xiaohu Huang, Hao Zhou, Jian Wang, Haocheng Feng, Junyu Han, Errui Ding, Jingdong Wang 0001, Xinggang Wang, Wenyu Liu 0001, Bin Feng 0001. ICLR 2023. [doi]
- Global-local contrastive multiview representation learning for skeleton-based action recognition. Cunling Bian, Wei Feng 0005, Fanbo Meng, Song Wang 0002. CVIU, 229:103655, March 2023. [doi]
- Efficient Spatio-Temporal Contrastive Learning for Skeleton-Based 3-D Action Recognition. Xuehao Gao, Yang Yang 0066, Yimeng Zhang, Maosen Li, Jin-Gang Yu, Shaoyi Du. TMM, 25:405-417, 2023. [doi]
- Cross-stream contrastive learning for self-supervised skeleton-based action recognition. Ding Li 0006, Yongqiang Tang, Zhizhong Zhang, Wensheng Zhang 0002. IVC, 135:104689, 2023. [doi]
- Hierarchical Consistent Contrastive Learning for Skeleton-Based Action Recognition with Growing Augmentations. Jiahang Zhang, Lilang Lin, Jiaying Liu 0001. AAAI 2023: 3427-3435. [doi]
- Actionlet-Dependent Contrastive Learning for Unsupervised Skeleton-Based Action Recognition. Lilang Lin, Jiahang Zhang, Jiaying Liu 0001. CVPR 2023: 2363-2372. [doi]
- Contrast-Reconstruction Representation Learning for Self-Supervised Skeleton-Based Action Recognition. Peng Wang, Jun Wen, Chenyang Si, Yuntao Qian, Liang Wang 0001. TIP, 31:6224-6238, 2022. [doi]
- Learning Representations by Contrastive Spatio-Temporal Clustering for Skeleton-Based Action Recognition. Mingdao Wang, Xueming Li, Siqi Chen, Xianlin Zhang, Lei Ma, Yue Zhang 0016. TMM, 26:3207-3220, 2024. [doi]
- Augmented Skeleton Based Contrastive Action Learning with Momentum LSTM for Unsupervised Action Recognition. Haocong Rao, Shihao Xu, Xiping Hu 0001, Jun Cheng 0002, Bin Hu 0001. ISCI, 569:90-109, 2021. [doi]