Corpus ID: 246706260

Online Learning for Min Sum Set Cover and Pandora's Box

Evangelia Gergatsouli and Christos Tzamos
Two central problems in Stochastic Optimization are Min Sum Set Cover and Pandora's Box. In Pandora's Box, we are presented with n boxes, each containing an unknown value, and the goal is to open the boxes in some order to minimize the sum of the search cost and the smallest value found. Given a distribution of value vectors, we are asked to identify a near-optimal search order. Min Sum Set Cover corresponds to the special case where values are either 0 or infinity. In this work, we study the case…
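For concreteness, the objective above can be evaluated for a fixed, non-adaptive opening order: open boxes one by one and stop at whichever prefix minimizes the opening cost paid so far plus the smallest value seen so far. A minimal sketch (the function name `pandora_cost` and the toy numbers are illustrative, not from the paper):

```python
def pandora_cost(order, cost, value):
    """Cost of following a fixed opening order with optimal stopping:
    min over prefixes of (total opening cost + smallest value found)."""
    best = float("inf")        # best objective over all stopping points
    opened_cost = 0.0          # total search cost paid so far
    best_value = float("inf")  # smallest value found so far
    for i in order:
        opened_cost += cost[i]
        best_value = min(best_value, value[i])
        best = min(best, opened_cost + best_value)
    return best

# Two boxes, each costing 1 to open, containing values 5 and 2:
# opening the low-value box first and stopping there is optimal.
print(pandora_cost([1, 0], cost=[1, 1], value=[5, 2]))  # → 3.0
```

Min Sum Set Cover then falls out as the special case where every value is 0 (a "covering" box) or infinity.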

Contextual Pandora's Box

A no-regret algorithm is given that performs comparably to the optimal algorithm that knows all prior distributions exactly, even in the bandit setting where the algorithm never learns the values of the alternatives that were not explored.



Online Pandora's Boxes and Bandits

This work considers online variations of the Pandora’s box problem, a standard model for understanding issues related to the cost of acquiring information for decision-making, and shows that in many scenarios, Pandora can achieve a good approximation to the best possible performance.

Pandora's Box Problem with Order Constraints

It is proved that finding approximately optimal adaptive search strategies is NP-hard when certain matroid constraints are used to further restrict the set of boxes which may be opened, or when the order constraints are given as reachability constraints on a DAG.

Pandora's Box with Correlations: Learning and Approximation

This paper provides the first approximation algorithms for Pandora's Box-type problems with correlations, considers a number of different feasibility constraints, and provides simple partially adaptive (PA) strategies that are approximately optimal with respect to the best PA strategy for each case.

A constant factor approximation algorithm for generalized min-sum set cover

A simple randomized constant factor approximation algorithm is given for the generalized min-sum set cover problem, in which we are given a universe of elements and a collection of subsets, with each set S having a covering requirement.

Approximating Min Sum Set Cover

For the min sum vertex cover version of the problem, it is shown that it can be approximated within a ratio of 2 and that it is NP-hard to approximate within some constant ρ > 1.
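The classic greedy rule for min sum set cover, repeatedly picking the set that covers the most still-uncovered elements, is known to be a 4-approximation for this objective. A minimal sketch on a toy instance (function names and the example instance are mine, not from the paper):

```python
def mssc_greedy(universe, sets):
    """Greedy min sum set cover: at each step pick the set covering
    the most still-uncovered elements; returns the pick order."""
    uncovered = set(universe)
    remaining = list(range(len(sets)))
    order = []
    while uncovered and remaining:
        best = max(remaining, key=lambda j: len(sets[j] & uncovered))
        if not sets[best] & uncovered:
            break  # no remaining set covers anything new
        order.append(best)
        uncovered -= sets[best]
        remaining.remove(best)
    return order

def mssc_cost(order, sets, universe):
    """Objective: sum over elements of the step at which each is first covered
    (assumes the order covers the whole universe)."""
    cover_time = {}
    for t, j in enumerate(order, start=1):
        for e in sets[j]:
            cover_time.setdefault(e, t)
    return sum(cover_time[e] for e in universe)

sets = [{1, 2}, {3}, {2, 3}]
order = mssc_greedy({1, 2, 3}, sets)       # picks {1,2} first, then a set covering 3
print(mssc_cost(order, sets, {1, 2, 3}))   # → 4
```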

The Pipelined Set Cover Problem

This work uses a linear-programming framework to show that the greedy and local-search algorithms are 4-approximations for pipelined set cover, and extends the analysis to minimizing the ℓ_p-norm of the costs paid by the sets, where p > 2 is an integer.

Adaptivity Gaps for Stochastic Probing: Submodular and XOS Functions

This paper studies the gap between adaptive and non-adaptive strategies for f being a submodular or a fractionally subadditive (XOS) function, and shows that the adaptivity gap is a constant for monotone and non-monotone submodular functions, and logarithmic for XOS functions of small width.

Learning to Branch

It is shown how to use machine learning to determine an optimal weighting of any set of partitioning procedures for the instance distribution at hand using samples from the distribution, and it is proved that the resulting reduction in tree size can even be exponential.

Query strategies for priced information (extended abstract)

This work considers a class of problems in which an algorithm seeks to compute a function over a set of inputs, where each input has an associated price, and investigates a model for pricing in this framework, constructing a set of prices for any AND/OR tree that satisfies a very strong type of equilibrium property.

Submodular Stochastic Probing on Matroids

A (1-1/e)/(k_in + k_out + 1)-approximation algorithm is given for the case with k_in ≥ 0 matroids as inner constraints and k_out ≥ 1 matroids as outer constraints.