ET-BERT: A Contextualized Datagram Representation with Pre-training Transformers for Encrypted Traffic Classification

Xinjie Lin, Gang Xiong, Gaopeng Gou, Zhen Li, Junzheng Shi, Jing Yu. ET-BERT: A Contextualized Datagram Representation with Pre-training Transformers for Encrypted Traffic Classification. In Frédérique Laforest, Raphaël Troncy, Elena Simperl, Deepak Agarwal, Aristides Gionis, Ivan Herman, Lionel Médini, editors, WWW '22: The ACM Web Conference 2022, Virtual Event, Lyon, France, April 25 - 29, 2022. pages 633-642, ACM, 2022. [doi]

@inproceedings{LinXGLSY22,
  title = {ET-BERT: A Contextualized Datagram Representation with Pre-training Transformers for Encrypted Traffic Classification},
  author = {Xinjie Lin and Gang Xiong and Gaopeng Gou and Zhen Li and Junzheng Shi and Jing Yu},
  year = {2022},
  doi = {10.1145/3485447.3512217},
  url = {https://doi.org/10.1145/3485447.3512217},
  researchr = {https://researchr.org/publication/LinXGLSY22},
  pages = {633--642},
  booktitle = {WWW '22: The ACM Web Conference 2022, Virtual Event, Lyon, France, April 25 - 29, 2022},
  editor = {Frédérique Laforest and Raphaël Troncy and Elena Simperl and Deepak Agarwal and Aristides Gionis and Ivan Herman and Lionel Médini},
  publisher = {ACM},
  isbn = {978-1-4503-9096-5},
}