Experience-efficient learning in associative bandit problems

Alexander L. Strehl, Chris Mesterharm, Michael L. Littman, and Haym Hirsh. Experience-efficient learning in associative bandit problems. In William W. Cohen and Andrew Moore, editors, Machine Learning, Proceedings of the Twenty-Third International Conference (ICML 2006), Pittsburgh, Pennsylvania, USA, June 25-29, 2006, volume 148 of ACM International Conference Proceeding Series, pages 889-896. ACM, 2006.
