Corpus ID: 220425420

Collapsing Bandits and Their Application to Public Health Interventions

@article{Mate2020CollapsingBA,
  title={Collapsing Bandits and Their Application to Public Health Interventions},
  author={Aditya Mate and J. Killian and H. Xu and A. Perrault and Milind Tambe},
  journal={ArXiv},
  year={2020},
  volume={abs/2007.04432}
}
We propose and study Collapsing Bandits, a new restless multi-armed bandit (RMAB) setting in which each arm follows a binary-state Markovian process with a special structure: when an arm is played, its state is fully observed, thus "collapsing" any uncertainty, but when an arm is passive, no observation is made, allowing uncertainty to evolve. The goal is to keep as many arms in the "good" state as possible by planning a limited budget of actions per round. Such Collapsing Bandits are…
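The belief dynamics described in the abstract can be sketched for a single arm. This is a minimal illustration, not the paper's algorithm: `belief_update`, the 2x2 transition matrix `P`, and the convention that state 1 is the "good" state are all assumptions made here for clarity.

```python
def belief_update(belief, P, played, true_state=None):
    """One round of belief evolution for a single binary-state arm.

    belief: current probability that the arm is in the "good" state (1).
    P: 2x2 transition matrix, P[s][t] = probability of moving from s to t.
    played: if True, the arm's state is observed, collapsing uncertainty.
    true_state: the observed state (required only when played is True).
    """
    if played:
        # Playing the arm reveals the true state exactly ("collapsing"
        # the belief); the arm then transitions one Markov step.
        return P[true_state][1]
    # Passive arm: no observation, so the belief evolves forward
    # under the Markov chain without collapsing.
    return belief * P[1][1] + (1 - belief) * P[0][1]

# Example: a chain that tends to stay in its current state.
P = [[0.9, 0.1],   # from "bad" state 0
     [0.2, 0.8]]   # from "good" state 1
b = belief_update(0.5, P, played=False)          # belief drifts: 0.45
b_obs = belief_update(0.5, P, played=True, true_state=1)  # collapses: 0.8
```

Under this sketch, a budgeted planner would each round choose which arms to play (collapsing their beliefs) and let the rest drift passively.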
