Bandits with Budgets: Regret Lower Bounds and Optimal Algorithms

Richard Combes, Chong Jiang, Rayadurgam Srikant. Bandits with Budgets: Regret Lower Bounds and Optimal Algorithms. In Bill Lin, Jun (Jim) Xu, Sudipta Sengupta, Devavrat Shah, editors, Proceedings of the 2015 ACM SIGMETRICS International Conference on Measurement and Modeling of Computer Systems, Portland, OR, USA, June 15-19, 2015, pages 245-257. ACM, 2015.
