Transformer-BLS: An efficient learning algorithm based on multi-head attention mechanism and incremental learning algorithms

Rongrong Fu, Haifeng Liang, Shiwei Wang, Chengcheng Jia, Guangbin Sun, Tengfei Gao, Dan Chen, Yaodong Wang. Transformer-BLS: An efficient learning algorithm based on multi-head attention mechanism and incremental learning algorithms. Expert Syst. Appl., 238(Part A):121734, March 2024. doi:10.1016/j.eswa.2023.121734

@article{FuLWJSGCW24,
  title = {Transformer-BLS: An efficient learning algorithm based on multi-head attention mechanism and incremental learning algorithms},
  author = {Rongrong Fu and Haifeng Liang and Shiwei Wang and Chengcheng Jia and Guangbin Sun and Tengfei Gao and Dan Chen and Yaodong Wang},
  year = {2024},
  month = {March},
  doi = {10.1016/j.eswa.2023.121734},
  url = {https://doi.org/10.1016/j.eswa.2023.121734},
  researchr = {https://researchr.org/publication/FuLWJSGCW24},
  journal = {Expert Syst. Appl.},
  volume = {238},
  number = {Part A},
  pages = {121734},
}