VCC: Scaling Transformers to 128K Tokens or More by Prioritizing Important Tokens

Zhanpeng Zeng, Cole Hawkins, Mingyi Hong, Aston Zhang, Nikolaos Pappas 0004, Vikas Singh, Shuai Zheng. VCC: Scaling Transformers to 128K Tokens or More by Prioritizing Important Tokens. In Alice Oh, Tristan Naumann, Amir Globerson, Kate Saenko, Moritz Hardt, Sergey Levine, editors, Advances in Neural Information Processing Systems 36: Annual Conference on Neural Information Processing Systems 2023, NeurIPS 2023, New Orleans, LA, USA, December 10 - 16, 2023. 2023. [doi]

Authors

Zhanpeng Zeng
Cole Hawkins
Mingyi Hong
Aston Zhang
Nikolaos Pappas 0004
Vikas Singh
Shuai Zheng