On Explore-Then-Commit Strategies

Aurélien Garivier, Emilie Kaufmann, Tor Lattimore
We study the problem of minimising regret in two-armed bandit problems with Gaussian rewards. Our objective is to use this simple setting to illustrate that strategies based on an exploration phase (up to a stopping time) followed by exploitation are necessarily suboptimal. The results hold regardless of whether or not the difference in means between the two arms is known. Besides the main message, we also refine existing deviation inequalities, which allow us to design fully sequential…
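To make the setting concrete, here is a minimal sketch of a fixed-design explore-then-commit strategy on a two-armed bandit with unit-variance Gaussian rewards. The function name `etc_regret` and the specific parameters (means, exploration length, horizon) are illustrative choices, not from the paper; the paper's argument concerns strategies of this general shape, including those with data-dependent stopping times.

```python
import random

def etc_regret(mu, n_explore, horizon, seed=0):
    """Simulate explore-then-commit on a two-armed Gaussian bandit.

    Exploration: pull each arm n_explore times.
    Commit: play the arm with the higher empirical mean for the
    remaining horizon - 2*n_explore rounds.
    Returns the realised cumulative regret against always playing
    the arm with the best true mean.
    """
    rng = random.Random(seed)
    sums = [0.0, 0.0]
    total_reward = 0.0
    # Exploration phase: sample both arms equally.
    for _ in range(n_explore):
        for arm in (0, 1):
            r = rng.gauss(mu[arm], 1.0)
            sums[arm] += r
            total_reward += r
    # Commit phase: exploit the empirically better arm.
    chosen = 0 if sums[0] >= sums[1] else 1
    for _ in range(horizon - 2 * n_explore):
        total_reward += rng.gauss(mu[chosen], 1.0)
    return horizon * max(mu) - total_reward

# Average realised regret over independent runs (gap = 0.5).
runs = 200
avg = sum(etc_regret((0.0, 0.5), n_explore=50, horizon=2000, seed=s)
          for s in range(runs)) / runs
```

Averaged over runs, the regret is dominated by the cost of the exploration phase plus the occasional commitment to the wrong arm; the paper's point is that no choice of (possibly adaptive) stopping rule for the exploration phase makes this scheme match fully sequential strategies.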