The following publications are possibly variants of this publication:
- Calibrated Multi-Task Learning. Feiping Nie, Zhanxuan Hu, Xuelong Li. kdd 2018: 2012-2021 [doi]
- A Phone-Level Speaker Embedding Extraction Framework with Multi-Gate Mixture-of-Experts Based Multi-Task Learning. Zhijunyi Yang, Mengjie Du, Rongfeng Su, Xiaokang Liu, Nan Yan, Lan Wang. iscslp 2022: 240-244 [doi]
- M³ViT: Mixture-of-Experts Vision Transformer for Efficient Multi-task Learning with Model-Accelerator Co-design. Hanxue Liang, Zhiwen Fan, Rishov Sarkar, Ziyu Jiang, Tianlong Chen, Kai Zou, Yu Cheng 0001, Cong Hao, Zhangyang Wang. nips 2022: [doi]
- Expert-Calibrated Learning for Online Optimization with Switching Costs. Pengfei Li, Jianyi Yang, Shaolei Ren. sigmetrics 2022: 85-86 [doi]
- Expert-Calibrated Learning for Online Optimization with Switching Costs. Pengfei Li, Jianyi Yang, Shaolei Ren. pomacs, 6(2), 2022. [doi]
- Multi-modal Mixture of Experts Representation Learning for Sequential Recommendation. Shuqing Bian, Xingyu Pan, Wayne Xin Zhao, Jinpeng Wang, Chuyuan Wang, Ji-Rong Wen. CIKM 2023: 110-119 [doi]