
RSS Bandit

Known as: Bandit (disambiguation)
RSS Bandit is an open source RSS/Atom aggregator based on the Microsoft .NET framework. It was originally released as a code sample in a series of…
Source: Wikipedia

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2018
Implicit feedback, such as user clicks, although abundant in online information service systems, does not provide substantial…
2018
We consider the problem of incentivizing exploration with heterogeneous agents. In this problem, bandit arms provide vector…
2015
A key challenge in information retrieval is that of on-line ranker evaluation: determining which one of a finite set of rankers…
2015
A key aim of current research is to create robots that can reliably manipulate objects. However, in many applications, general…
Highly Cited
2014
This paper proposes a new method for the K-armed dueling bandit problem, a variation on the regular K-armed bandit problem that…
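In the dueling bandit setting, feedback arrives as noisy pairwise comparisons between two arms rather than absolute rewards. As a minimal illustration of that interaction loop (not the paper's proposed method, which the snippet truncates), the sketch below duels uniformly random pairs under an assumed logistic comparison model and ranks arms by their empirical Copeland score; the hidden utilities and the naive pair selection are illustrative assumptions:

```python
import math
import random

# Minimal dueling-bandit loop: feedback is a pairwise preference between
# two arms, not an absolute reward. The arm utilities and the logistic
# comparison model here are illustrative assumptions, not from the paper.
K = 5
utilities = [random.gauss(0.0, 1.0) for _ in range(K)]  # hidden ground truth
wins = [[0] * K for _ in range(K)]  # wins[i][j]: times arm i beat arm j

def duel(i: int, j: int) -> int:
    """Return the winner of one noisy comparison between arms i and j."""
    p_i_beats_j = 1.0 / (1.0 + math.exp(utilities[j] - utilities[i]))
    return i if random.random() < p_i_beats_j else j

for t in range(5000):
    i, j = random.sample(range(K), 2)  # naive uniform pair selection
    winner = duel(i, j)
    loser = j if winner == i else i
    wins[winner][loser] += 1

# Rank arms by empirical Copeland score: the number of other arms an arm
# beats in a majority of duels -- a common summary in this setting.
def copeland(i: int) -> int:
    return sum(1 for j in range(K) if j != i and wins[i][j] > wins[j][i])

best = max(range(K), key=copeland)
print("estimated best arm:", best, "true best:", utilities.index(max(utilities)))
```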
2014
We study a new type of K-armed bandit problem where the expected return of one arm may depend on the returns of other arms. We…
2012
We propose a learning approach to pre-compute K-armed bandit playing policies by exploiting prior information describing the…
2010
Differential Evolution is a popular and powerful optimization algorithm for continuous problems. Part of its efficiency comes from…
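The snippet is cut off before explaining where that efficiency comes from, but the core Differential Evolution loop itself is compact. Here is a minimal sketch of the classic DE/rand/1/bin variant on a toy sphere objective; the population size and the F and CR parameters are conventional defaults assumed for illustration, not values from the paper:

```python
import random

# One run of classic DE/rand/1/bin on a toy sphere objective.
# NP, F, and CR are conventional defaults, not taken from the paper above.
def sphere(x):
    return sum(xi * xi for xi in x)

DIM, NP, F, CR = 5, 20, 0.8, 0.9
pop = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(NP)]

for gen in range(200):
    for i in range(NP):
        # Mutation (rand/1): combine three distinct random individuals.
        a, b, c = random.sample([p for k, p in enumerate(pop) if k != i], 3)
        mutant = [a[d] + F * (b[d] - c[d]) for d in range(DIM)]
        # Binomial crossover: take each coordinate from the mutant with
        # probability CR, forcing at least one mutant coordinate (j_rand).
        j_rand = random.randrange(DIM)
        trial = [mutant[d] if (random.random() < CR or d == j_rand)
                 else pop[i][d] for d in range(DIM)]
        # Greedy selection: keep the trial only if it is no worse.
        if sphere(trial) <= sphere(pop[i]):
            pop[i] = trial

print("best value found:", min(sphere(p) for p in pop))
```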
2007
The K-armed bandit problem is a formalization of the exploration versus exploitation dilemma, a well-known issue in…
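As a concrete illustration of the exploration-versus-exploitation dilemma the snippet mentions, here is a minimal ε-greedy sketch on assumed Bernoulli arms. This is a standard textbook strategy, not the method of the paper above: with probability ε the learner explores a random arm, otherwise it exploits the arm with the best empirical mean.

```python
import random

# Epsilon-greedy on K Bernoulli arms. The arm success probabilities
# below are illustrative assumptions, not taken from the paper.
probs = [0.2, 0.5, 0.7, 0.4]   # hidden success probability of each arm
counts = [0] * len(probs)      # pulls per arm
values = [0.0] * len(probs)    # empirical mean reward per arm
eps = 0.1

for t in range(10_000):
    if random.random() < eps:
        arm = random.randrange(len(probs))                     # explore
    else:
        arm = max(range(len(probs)), key=lambda a: values[a])  # exploit
    reward = 1.0 if random.random() < probs[arm] else 0.0
    counts[arm] += 1
    values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean

print("empirical means:", [round(v, 3) for v in values])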
Highly Cited
2005
The multiarmed bandit is often used as an analogy for the tradeoff between exploration and exploitation in search problems. The…
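One classic way to manage the tradeoff this snippet describes is UCB1, which adds a confidence bonus to each arm's empirical mean so that under-sampled arms keep getting explored. A minimal sketch on assumed Bernoulli arms follows; it is illustrative only and not necessarily the policy studied in the paper:

```python
import math
import random

# UCB1: pull the arm maximizing empirical mean + sqrt(2 ln t / n_a).
# The bonus shrinks as an arm is sampled, so the policy explores
# under-sampled arms and exploits strong ones automatically.
# Arm probabilities are illustrative, not from the paper above.
probs = [0.3, 0.55, 0.6]
counts = [0] * len(probs)
values = [0.0] * len(probs)

for t in range(1, 10_001):
    if t <= len(probs):
        arm = t - 1  # initialization: pull each arm once
    else:
        arm = max(range(len(probs)),
                  key=lambda a: values[a] + math.sqrt(2 * math.log(t) / counts[a]))
    reward = 1.0 if random.random() < probs[arm] else 0.0
    counts[arm] += 1
    values[arm] += (reward - values[arm]) / counts[arm]

print("pull counts:", counts)  # most pulls should concentrate on the best arm
```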