Multi-armed bandit
Known as: N-armed bandit, Two-armed bandit, K-armed bandit
In probability theory, the multi-armed bandit problem (sometimes called the K- or N-armed bandit problem) is a problem in which a gambler at a row of… (Wikipedia)
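The definition above centers on the exploration-exploitation trade-off: each pull of an arm yields a random reward, and the gambler must balance sampling under-explored arms against replaying the best arm found so far. Below is a minimal sketch of that trade-off with an ε-greedy player on Bernoulli arms; the payout probabilities, ε, and horizon are illustrative assumptions, not drawn from any paper listed on this page.

```python
import random

def epsilon_greedy(true_probs, epsilon=0.1, horizon=10_000):
    """Simulate an epsilon-greedy gambler on Bernoulli arms.

    true_probs: hidden success probability of each arm
    epsilon:    chance of exploring a random arm instead of exploiting
    """
    n_arms = len(true_probs)
    pulls = [0] * n_arms      # times each arm has been played
    means = [0.0] * n_arms    # running average reward per arm
    total = 0.0

    for _ in range(horizon):
        if random.random() < epsilon:
            arm = random.randrange(n_arms)                    # explore
        else:
            arm = max(range(n_arms), key=lambda a: means[a])  # exploit
        reward = 1.0 if random.random() < true_probs[arm] else 0.0
        pulls[arm] += 1
        means[arm] += (reward - means[arm]) / pulls[arm]      # incremental mean
        total += reward
    return total, pulls

# Three slot machines with made-up payout rates 0.2, 0.5, 0.7; over time the
# player should concentrate most pulls on the last arm.
print(epsilon_greedy([0.2, 0.5, 0.7]))
```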
Related topics
14 relations, including:
A/B testing
Bayesian optimization
Design of experiments
Greedy algorithm
Broader (2)
Machine learning
Stochastic optimization
Papers overview
Semantic Scholar uses AI to extract papers important to this topic.
Achieving complete learning in Multi-Armed Bandit problems
Sattar Vakili, Qing Zhao
Asilomar Conference on Signals, Systems and…, 2013. Corpus ID: 5698337
In the classic Multi-Armed Bandit (MAB) problem, there is a given set of arms with unknown reward distributions. At each time, a…
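The classic setting this abstract restates (a fixed set of arms with unknown reward distributions, one pull per round) is commonly handled by index policies such as UCB1. The sketch below is a standard UCB1 loop on illustrative Bernoulli arms; it is not claimed to be the algorithm of the paper above.

```python
import math
import random

def ucb1(true_probs, horizon=10_000):
    """UCB1: pull each arm once, then always pull the arm maximizing
    empirical mean + sqrt(2 * ln t / pulls)."""
    n_arms = len(true_probs)
    pulls = [0] * n_arms
    means = [0.0] * n_arms

    for t in range(1, horizon + 1):
        if t <= n_arms:
            arm = t - 1  # initialization rounds: one pull per arm
        else:
            arm = max(
                range(n_arms),
                key=lambda a: means[a] + math.sqrt(2 * math.log(t) / pulls[a]),
            )
        reward = 1.0 if random.random() < true_probs[arm] else 0.0
        pulls[arm] += 1
        means[arm] += (reward - means[arm]) / pulls[arm]
    return means, pulls
```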
Group recommendations via multi-armed bandits
José Bento, Stratis Ioannidis, S. Muthukrishnan, Jinyun Yan
The Web Conference, 2012. Corpus ID: 116765060
We study recommendations for persistent groups that repeatedly engage in a joint activity. We approach this as a multi-arm bandit…
A Decision Task in a Social Context: Human Experiments, Models, and Analyses of Behavioral Data
Andrea Nedic, Damon Tomlin, P. Holmes, Deborah A. Prentice, J. Cohen
Proceedings of the IEEE, 2012. Corpus ID: 14875293
To investigate the influence of information about fellow group members in a constrained decision-making context, we develop four…
Tug-of-War Model for Multi-armed Bandit Problem
Song-Ju Kim, M. Aono, M. Hara
International Conference on Unconventional…, 2010. Corpus ID: 46529849
We propose a model - the "tug-of-war (TOW) model" - to conduct unique parallel searches using many nonlocally correlated search…
Decentralized multi-armed bandit with imperfect observations
Keqin Liu, Qing Zhao, B. Krishnamachari
Allerton Conference on Communication, Control…, 2010. Corpus ID: 15426846
We consider decentralized multi-armed bandit problems with multiple distributed players. At each time, each player chooses one of…
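The abstract is truncated, but decentralized bandit models typically have several players pulling arms simultaneously, with degraded reward when players select the same arm. The toy simulation below uses independent ε-greedy learners and a zero-reward collision rule; both choices are illustrative assumptions, not this paper's formulation.

```python
import random

def decentralized_play(true_probs, n_players=2, epsilon=0.1, horizon=10_000):
    """Toy decentralized bandit: each player runs its own epsilon-greedy
    learner; colliding players (same arm, same round) get zero reward."""
    n_arms = len(true_probs)
    pulls = [[0] * n_arms for _ in range(n_players)]
    means = [[0.0] * n_arms for _ in range(n_players)]
    totals = [0.0] * n_players

    for _ in range(horizon):
        # Each player picks an arm using only its own observation history.
        choices = []
        for p in range(n_players):
            if random.random() < epsilon:
                choices.append(random.randrange(n_arms))
            else:
                choices.append(max(range(n_arms), key=lambda a: means[p][a]))
        for p, arm in enumerate(choices):
            collided = choices.count(arm) > 1
            reward = 0.0 if collided else (
                1.0 if random.random() < true_probs[arm] else 0.0
            )
            pulls[p][arm] += 1
            means[p][arm] += (reward - means[p][arm]) / pulls[p][arm]
            totals[p] += reward
    return totals  # per-player accumulated reward
```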
Anytime algorithms for multi-armed bandit problems
Robert D. Kleinberg
ACM-SIAM Symposium on Discrete Algorithms, 2006. Corpus ID: 8354051
How should a decision-maker perform repeated choices so as to optimize the average cost or benefit of those choices in the long…
Information technology project failures: Applying the bandit problem to evaluate managerial decision making
D. Chulkov, Mayur S. Desai
Inf. Manag. Comput. Security, 2005. Corpus ID: 35001777
Purpose – This paper seeks to apply results from the study of bandit processes to cases of information technology (IT) project…
α5 Subunit Alters Desensitization, Pharmacology, Ca++ Permeability and Ca++ Modulation of Human Neuronal α3 Nicotinic Receptors
V. Gerzanich, Fan Wang, A. Kuryatov, J. Lindstrom
1998. Corpus ID: 36089274
Functional effects of human α5 nicotinic ACh receptor (AChR) subunits coassembled with α3 and β2 or with α3 and β4 subunits were…
Association of Intercellular Adhesion Molecule-1 (ICAM-1) with Actin-containing Cytoskeleton and α-actinin
O. Carpén, P. Pallai, D. Staunton, T. Springer
1992 (Highly Cited). Corpus ID: 9187895
We have studied the cytoskeletal association of intercellular adhesion molecule-1 (ICAM-1, CD54), an integral membrane protein…
Extension of the multi-armed bandit problem
P. Varaiya, J. Walrand, C. Buyukkoc
IEEE Conference on Decision and Control, 1983. Corpus ID: 45410437
There are N independent machines. Machine i is described by a sequence {Xi(s), Fi(s)} where Xi(s) is the immediate reward and Fi…