# On Multiset Selection With Size Constraints

```bibtex
@inproceedings{Qian2018OnMS,
  title     = {On Multiset Selection With Size Constraints},
  author    = {Chao Qian and Yibo Zhang and Ke Tang and Xin Yao},
  booktitle = {AAAI},
  year      = {2018}
}
```

This paper considers the multiset selection problem with size constraints, which arises in many real-world applications such as budget allocation. Previous studies required the objective function f to be submodular; we relax this assumption by introducing the notion of submodularity ratios (denoted by α_f and β_f). We propose an anytime randomized iterative approach, POMS, which maximizes the given objective f and minimizes the multiset size simultaneously. We prove that POMS using…
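The Pareto-optimization idea behind POMS can be illustrated with a minimal sketch: keep an archive of mutually non-dominated integer vectors, mutate a random archived solution, and bi-objectively compare by objective value and multiset size. This is only an illustrative loop in the spirit of the approach, not the paper's exact procedure; the mutation scheme, the `2b` size cap, and all parameter names here are assumptions.

```python
import random

def poms(f, n, b, k, iters=2000, seed=0):
    """Illustrative POMS-style Pareto optimization loop (a sketch, not the
    paper's exact algorithm).  Solutions are integer vectors x in {0,...,k}^n;
    we treat maximizing f(x) and minimizing |x| = sum(x) as two objectives,
    keep an archive of mutually non-dominated solutions, and finally return
    the best archived solution whose size respects the budget b."""
    rng = random.Random(seed)

    def size(x):
        return sum(x)

    def dominates(a, c):
        # a weakly better on both objectives, strictly better on at least one
        fa, fc = f(a), f(c)
        return (fa >= fc and size(a) <= size(c)) and (fa > fc or size(a) < size(c))

    archive = [tuple([0] * n)]          # start from the empty multiset
    for _ in range(iters):
        x = list(rng.choice(archive))
        # lattice mutation: resample each coordinate with probability 1/n
        for i in range(n):
            if rng.random() < 1.0 / n:
                x[i] = rng.randint(0, k)
        x = tuple(x)
        if size(x) > 2 * b:             # assumed size cap, as is common in
            continue                    # Pareto-optimization subset selection
        if any(dominates(y, x) for y in archive):
            continue
        archive = [y for y in archive if not dominates(x, y)] + [x]
    feasible = [y for y in archive if size(y) <= b]
    return max(feasible, key=f)
```

For a monotone f, the archive approximates the Pareto front over all sizes, which is what makes the method "anytime": stopping earlier simply returns a cruder front.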

## 12 Citations

Maximizing DR-submodular+supermodular functions on the integer lattice subject to a cardinality constraint

- Mathematics, Computer Science
- J. Glob. Optim., 2021

This work introduces a decreasing threshold greedy algorithm with a binary search as its subroutine to solve the problem of maximizing the sum of a monotone non-negative diminishing return submodular (DR-submodular) function and a supermodular function on the integer lattice subject to a cardinality constraint.
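A decreasing-threshold greedy with an inner binary search can be sketched as follows. This is a generic illustration of the technique on the integer lattice, not the cited paper's exact routine: it assumes a normalized monotone objective (f of the zero vector is 0), a per-element copy limit `k`, a total budget `b`, and that diminishing returns make the average marginal gain monotone in the step size (which justifies the binary search).

```python
def threshold_greedy(f, n, k, b, eps=0.1):
    """Illustrative decreasing-threshold greedy on the integer lattice
    (a sketch in the spirit of such algorithms).  At each threshold tau we
    binary-search, per element, the largest number of extra copies whose
    average marginal gain still meets tau, add them, and lower tau by a
    (1 - eps) factor until the budget b is exhausted."""
    x = [0] * n
    used = 0
    # initial threshold: the best single-copy gain (assumes f([0]*n) == 0)
    d = max(f([1 if j == i else 0 for j in range(n)]) for i in range(n))
    tau = d
    while tau > (eps / n) * d and used < b:
        for i in range(n):
            if used >= b:
                break
            # binary search the largest step c with average gain >= tau
            lo, hi, best = 1, min(k - x[i], b - used), 0
            while lo <= hi:
                mid = (lo + hi) // 2
                y = x[:]
                y[i] += mid
                if (f(y) - f(x)) / mid >= tau:
                    best, lo = mid, mid + 1
                else:
                    hi = mid - 1
            if best:
                x[i] += best
                used += best
        tau *= 1 - eps
    return x
```

The binary search is what keeps the query complexity low: instead of adding copies one at a time, each element contributes O(log b) objective evaluations per threshold level.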

Sequence Selection by Pareto Optimization

- Computer Science, Mathematics
- IJCAI, 2018

It is proved that for any previously studied objective function, POSeqSel using a reasonable time can always reach or improve the best known approximation guarantee.

Multi-objective Evolutionary Algorithms are Generally Good: Maximizing Monotone Submodular Functions over Sequences

- Computer Science
- ArXiv, 2021

It is proved that for each kind of previously studied monotone submodular objective functions over sequences, a simple multi-objective EA, i.e., GSEMO, can always reach or improve the best known approximation guarantee after running polynomial time in expectation.

Multiobjective Evolutionary Algorithms Are Still Good: Maximizing Monotone Approximately Submodular Minus Modular Functions

- Mathematics, Computer Science
- Evolutionary Computation, 2021

It is proved that by optimizing the original objective function (g − c) and the size simultaneously, the GSEMO fails to achieve a good polynomial-time approximation guarantee, but by optimizing a distorted objective function and the size simultaneously, the GSEMO can still achieve the best-known polynomial-time approximation guarantee.

From Sets to Multisets: Provable Variational Inference for Probabilistic Integer Submodular Models

- Computer Science, Mathematics
- ICML, 2020

This work proposes the Generalized Multilinear Extension, a continuous DR-submodular extension for integer submodular functions, and formulates a new probabilistic model defined through integer submodular functions.

Maximizing Monotone DR-submodular Continuous Functions by Derivative-free Optimization

- Computer Science, Mathematics
- ArXiv, 2018

This paper proposes the first derivative-free algorithm, LDGM, and proves that LDGM can achieve a $(1-e^{-\beta}-\epsilon)$-approximation guarantee after $O(1/\epsilon)$ iterations, matching the best previous gradient-based algorithm.

Efficient Algorithms for Monotone Non-Submodular Maximization with Partition Matroid Constraint

- Computer Science
- IJCAI, 2022

This work focuses on leveraging properties of partition matroid constraint to propose algorithms with theoretical bound and efficient query complexity and provide better analysis on theoretical performance guarantee of some existing techniques.

Fast Maximization of Non-Submodular, Monotonic Functions on the Integer Lattice

- Computer Science, Mathematics
- ICML, 2018

This work provides two approximation algorithms for maximizing a non-submodular function on the integer lattice subject to a cardinality constraint; these are the first algorithms for this purpose that have polynomial query complexity.

Theoretical analyses of multi-objective evolutionary algorithms on multi-modal objectives: (hot-off-the-press track at GECCO 2021)

- Computer Science
- AAAI, 2021

The OneJumpZeroJump problem is proposed, a bi-objective problem whose single objectives are isomorphic to the classic jump function benchmark, and it is proved that the simple evolutionary multi-objective optimizer (SEMO) cannot compute the full Pareto front.

Unsupervised Feature Selection by Pareto Optimization

- Computer Science
- AAAI, 2019

This paper proposes an anytime randomized iterative approach, POCSS, which minimizes the reconstruction error and the number of selected features simultaneously; experiments exhibit the superior performance of POCSS over state-of-the-art algorithms.

## References

Showing 1–10 of 22 references.

Optimal Budget Allocation: Theoretical Guarantee and Efficient Algorithm

- Computer Science, Mathematics
- ICML, 2014

This framework includes Alon et al.'s model, even with a competitor and with cost, and gives a faster (1 − 1/e)-approximation algorithm that scales to instances with almost 10M edges.

Constrained Monotone $k$ -Submodular Function Maximization Using Multiobjective Evolutionary Algorithms With Theoretical Guarantee

- Computer Science
- IEEE Transactions on Evolutionary Computation, 2018

This paper proposes a new approach which employs a multiobjective evolutionary algorithm to maximize the given objective and minimize the size simultaneously and proves that the proposed method can obtain the asymptotically tight approximation guarantee.

Maximizing Submodular Functions under Matroid Constraints by Evolutionary Algorithms

- Mathematics, Computer Science
- Evolutionary Computation, 2015

This paper investigates the runtime of a simple single-objective evolutionary algorithm, the (1+1) EA, and a multiobjective evolutionary algorithm, GSEMO, until they obtain a good approximation for submodular functions.

Submodular Optimization with Routing Constraints

- Business
- AAAI, 2016

This work proposes a generalized cost-benefit (GCB) greedy algorithm for the submodular optimization problem, and proves bi-criterion approximation guarantees under significantly weaker assumptions than those in related literature.

Online Submodular Welfare Maximization: Greedy is Optimal

- Computer Science, Economics
- SODA, 2013

It is proved that no online algorithm (even randomized, against an oblivious adversary) is better than 1/2-competitive for welfare maximization with coverage valuations unless NP = RP, implying that Greedy provides the optimal competitive ratio.

Maximizing Monotone Submodular Functions over the Integer Lattice

- Mathematics, Computer Science
- IPCO, 2016

This paper designs polynomial-time approximation algorithms for a cardinality constraint, a polymatroid constraint, and a knapsack constraint for functions defined over the integer lattice.

A Generalization of Submodular Cover via the Diminishing Return Property on the Integer Lattice

- Computer Science, Mathematics
- NIPS, 2015

This work considers a generalization of the submodular cover problem based on the concept of diminishing return property on the integer lattice and devise a bicriteria approximation algorithm that is guaranteed to output a log-factor approximate solution that satisfies the constraints with the desired accuracy.

SIMPATH: An Efficient Algorithm for Influence Maximization under the Linear Threshold Model

- Computer Science
- 2011 IEEE 11th International Conference on Data Mining

This paper proposes Simpath, an efficient and effective algorithm for influence maximization under the linear threshold model that incorporates several clever optimizations; experiments show that Simpath consistently outperforms the state of the art in running time, memory consumption, and the quality of the chosen seed set.

Guaranteed Non-convex Optimization: Submodular Maximization over Continuous Domains

- Computer Science, Mathematics
- AISTATS, 2017

The weak DR property is introduced, giving a unified characterization of submodularity for all set, integer-lattice, and continuous functions; for maximizing monotone DR-submodular continuous functions under general down-closed convex constraints, a Frank-Wolfe variant with an approximation guarantee and a sub-linear convergence rate is proposed.

Subset Selection under Noise

- Computer Science
- NIPS, 2017

It is proved that PONSS can achieve a better approximation ratio under assumptions such as an i.i.d. noise distribution, and empirical results on influence maximization and sparse regression problems show the superior performance of PONSS.