Corpus ID: 16918495

Fine-grained complexity of integer programming: The case of bounded branch-width and rank

@article{Fomin2016FinegrainedCO,
  title={Fine-grained complexity of integer programming: The case of bounded branch-width and rank},
  author={Fedor V. Fomin and Fahad Panolan and M. S. Ramanujan and Saket Saurabh},
  journal={ArXiv},
  year={2016},
  volume={abs/1607.05342}
}
We use the Exponential Time and Strong Exponential Time hypotheses (ETH and SETH) to provide conditional lower bounds on the solvability of the integer programming (IP) problem. We provide evidence that the running times of known pseudo-polynomial time algorithms solving IP, when the number of constraints is a constant [Papadimitriou, J. ACM 1981] and when the branch-width of the corresponding column-matroid is a constant [Cunningham and Geelen, IPCO 2007], are probably optimal.
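The abstract's first reference point, Papadimitriou's pseudo-polynomial algorithm for IPs with a constant number of constraints, works by dynamic programming over reachable right-hand-side vectors. The following is a minimal illustrative sketch in that spirit, not Papadimitriou's exact procedure: the function name, the explicit upper bound `U` on variable values, and the example instance are all assumptions made for demonstration.

```python
def ip_feasible(A, b, U):
    """Decide whether A @ x = b has a solution x in {0, ..., U}^n.

    Toy pseudo-polynomial dynamic program: sweep over the columns of A,
    maintaining the set of right-hand sides reachable by partial
    solutions. With a constant number of rows m and bounded entries,
    the state space stays pseudo-polynomial in the input numbers.
    """
    m, n = len(A), len(A[0])
    columns = [tuple(A[i][j] for i in range(m)) for j in range(n)]
    states = {tuple([0] * m)}  # partial sums A[:, :j] @ x[:j]
    for col in columns:
        states = {tuple(s[i] + t * col[i] for i in range(m))
                  for s in states for t in range(U + 1)}
    return tuple(b) in states

# x = (1, 2) gives 1*(2, 1) + 2*(3, 2) = (8, 5), so this is feasible:
print(ip_feasible([[2, 3], [1, 2]], [8, 5], U=3))  # True
# 2*x1 + 3*x2 = 1 has no nonnegative integer solution:
print(ip_feasible([[2, 3], [1, 2]], [1, 1], U=3))  # False
```

The exponential dependence on the number of constraints m in the state space is exactly the regime whose optimality the paper's conditional lower bounds address.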


Going Beyond Primal Treewidth for (M)ILP
TLDR
This work introduces, and algorithmically exploits, two new decompositional parameters for ILP and MILP, and obtains a full complexity landscape mapping the precise conditions under which incidence treewidth can be used to obtain efficient algorithms.
Proximity Results and Faster Algorithms for Integer Programming Using the Steinitz Lemma
TLDR
The Steinitz lemma is used to show that the ℓ1-distance between an optimal integer solution and an optimal fractional solution, also in the presence of upper bounds on the variables, is bounded by m · (2mΔ + 1)^m.
Complexity of optimizing over the integers
TLDR
The main merit of this paper is bringing together all of this information under one unifying umbrella with the hope that this will act as yet another catalyst for more interaction across the continuous-discrete divide.
Technical Report Column
TLDR
Improved concrete efficiency and security analysis of Reed-Solomon PCPPs and Randomized query complexity of sabotaged and composed functions, Shalev Ben-David, Robin Kothari, TR16-087.
Proximity results and faster algorithms for Integer Programming using the Steinitz Lemma
TLDR
A lemma of Steinitz is used, which states that a set of vectors in $R^m$ that is contained in the unit ball of a norm and that sums to zero can be ordered so that all partial sums have norm bounded by $m$.
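The Steinitz lemma described above can be checked by brute force on a tiny instance. A hedged sketch (the helper name and the example vectors are invented for illustration; enumerating all permutations is only viable for a handful of vectors):

```python
from itertools import permutations

def best_steinitz_ordering(vectors):
    """Brute-force the ordering that minimises the largest l-infinity
    norm taken over all partial sums (tiny instances only)."""
    m = len(vectors[0])
    best, best_perm = float("inf"), None
    for perm in permutations(vectors):
        partial = [0.0] * m
        worst = 0.0
        for v in perm:
            partial = [p + x for p, x in zip(partial, v)]
            worst = max(worst, max(abs(c) for c in partial))
        if worst < best:
            best, best_perm = worst, perm
    return best, best_perm

# Six vectors in the l-infinity unit ball of R^2 that sum to zero.
vs = [(1, 0), (1, 1), (-1, 0.5), (-1, -1), (0.5, -1), (-0.5, 0.5)]
assert all(abs(sum(v[i] for v in vs)) < 1e-9 for i in range(2))
bound, order = best_steinitz_ordering(vs)
# The lemma guarantees some ordering keeps every partial sum within
# norm m = 2; the brute-force optimum therefore satisfies this too.
print(bound <= 2)  # True
```

The lemma's role in the proximity result above is precisely this control of partial sums, applied to the columns of the constraint matrix.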

References

SHOWING 1-10 OF 33 REFERENCES
On Problems as Hard as CNF-SAT
TLDR
It is shown that, for every ε < 1, the problems HITTING SET, SET SPLITTING, and NAE-SAT cannot be computed in time O(2^{εn}) unless SETH fails, and it is proved that the fastest known algorithms for STEINER TREE, CONNECTED VERTEX COVER, SET PARTITIONING, and the pseudo-polynomial time algorithm for SUBSET SUM cannot be significantly improved.
Which problems have strongly exponential complexity?
  • R. Impagliazzo, R. Paturi, F. Zane
  • Computer Science, Mathematics
    Proceedings 39th Annual Symposium on Foundations of Computer Science (Cat. No.98CB36280)
  • 1998
For several NP-complete problems, there has been a progression of better but still exponential algorithms. In this paper we address the relative likelihood of sub-exponential algorithms for these problems.
Tight conditional lower bounds for counting perfect matchings on graphs of bounded treewidth, cliquewidth, and genus
TLDR
It is proved that, assuming the counting version of the Strong Exponential-Time Hypothesis (#SETH), the problem of counting perfect matchings has no (2 − ε)^k · n^{O(1)} time algorithm for any ε > 0 on graphs of treewidth k (but it can be solved in time O(n^{k+1}) if a k-expression is given).
Solving Connectivity Problems Parameterized by Treewidth in Single Exponential Time
TLDR
It is shown that the aforementioned gap cannot be breached for some problems that aim to maximize the number of connected components, such as Cycle Packing, and in several cases it is shown that improving those constants would cause the Strong Exponential Time Hypothesis to fail.
Edit Distance Cannot Be Computed in Strongly Subquadratic Time (unless SETH is false)
TLDR
This paper shows that, if the edit distance can be computed in time O(n^{2−δ}) for some constant δ > 0, then the satisfiability of conjunctive normal form formulas with N variables and M clauses can be solved in time M^{O(1)} · 2^{(1−ε)N} for a constant ε > 0.
Hardness of Easy Problems: Basing Hardness on Popular Conjectures such as the Strong Exponential Time Hypothesis (Invited Talk)
TLDR
Evidence is provided that a problem $A$ with a running time of O(n^k) that has not been improved in decades also requires n^{k−o(1)} time, thus explaining the lack of progress on the problem.
Popular Conjectures Imply Strong Lower Bounds for Dynamic Problems
TLDR
It is proved that sufficient progress would imply a breakthrough on one of five major open problems in the theory of algorithms, including dynamic versions of bipartite perfect matching, bipartite maximum weight matching, single-source reachability, single-source shortest paths, strong connectivity, subgraph connectivity, diameter approximation, and some non-graph problems.
Dynamic Programming and Fast Matrix Multiplication
TLDR
A novel general approach for solving NP-hard optimization problems that combines dynamic programming and fast matrix multiplication is given, which works faster than the usual dynamic programming solution for any vertex subset problem on graphs of bounded branchwidth.
Parameterized Algorithms
TLDR
This comprehensive textbook presents a clean and coherent account of most fundamental tools and techniques in Parameterized Algorithms and is a self-contained guide to the area, providing a toolbox of algorithmic techniques.
Known algorithms on graphs of bounded treewidth are probably optimal
TLDR
Lower bounds on the running time of algorithms solving problems on graphs of bounded treewidth are obtained, and these results are proved under the Strong Exponential Time Hypothesis of Impagliazzo and Paturi.