Symbolic Semantic Memory in Transformer Language Models

Robert Morain, Kenneth Vargas, Dan Ventura. Symbolic Semantic Memory in Transformer Language Models. In M. Arif Wani, Mehmed M. Kantardzic, Vasile Palade, Daniel Neagu, Longzhi Yang, Kit Yan Chan, editors, 21st IEEE International Conference on Machine Learning and Applications, ICMLA 2022, Nassau, Bahamas, December 12-14, 2022. pages 992-998, IEEE, 2022. doi: 10.1109/ICMLA55696.2022.00166

@inproceedings{MorainVV22,
  title = {Symbolic Semantic Memory in Transformer Language Models},
  author = {Robert Morain and Kenneth Vargas and Dan Ventura},
  year = {2022},
  doi = {10.1109/ICMLA55696.2022.00166},
  url = {https://doi.org/10.1109/ICMLA55696.2022.00166},
  researchr = {https://researchr.org/publication/MorainVV22},
  pages = {992-998},
  booktitle = {21st IEEE International Conference on Machine Learning and Applications, ICMLA 2022, Nassau, Bahamas, December 12-14, 2022},
  editor = {M. Arif Wani and Mehmed M. Kantardzic and Vasile Palade and Daniel Neagu and Longzhi Yang and Kit Yan Chan},
  publisher = {IEEE},
  isbn = {978-1-6654-6283-9},
}