Fine-Grained Complexity Theory: Conditional Lower Bounds for Computational Geometry

  • Karl Bringmann
Fine-grained complexity theory is the area of theoretical computer science that proves conditional lower bounds based on the Strong Exponential Time Hypothesis (SETH) and similar conjectures. This area has been thriving over the last decade, leading to conditionally best-possible algorithms for a wide variety of problems on graphs, strings, numbers, etc. This article is an introduction to fine-grained lower bounds in computational geometry, with a focus on lower bounds for polynomial-time problems based…
1 Citation
Conditional Lower Bounds for Dynamic Geometric Measure Problems
New polynomial lower bounds are given for a number of dynamic measure problems in computational geometry, including counting maximal or extremal points in ℝ³, different variants of Klee’s Measure Problem, and problems related to finding the largest empty disk in a set of points.


Fine-Grained Complexity Theory (Tutorial)
This tutorial gives an introduction to fine-grained complexity theory, which allows one to rule out faster algorithms by proving conditional lower bounds via fine-grained reductions from certain key conjectures.
  • V. V. Williams
  • Mathematics
    Proceedings of the International Congress of Mathematicians (ICM 2018)
  • 2019
In recent years, a new “fine-grained” theory of computational hardness has been developed, based on “fine-grained reductions” that focus on exact running times for problems. Mimicking NP-hardness,
Fine-Grained Analysis of Problems on Curves
We provide conditional lower bounds on two problems on polygonal curves. First, we generalize a recent result on the (discrete) Fréchet distance to k curves. Specifically, we show that, assuming the
Improved Approximation for Fréchet Distance on c-Packed Curves Matching Conditional Lower Bounds
An improved algorithm with time complexity 𝒪(cn log²(1/ε)/√ε + cn log n).
SETH vs Approximation
Our story is about hardness of problems in P, but its roots begin with two algorithmic approaches that have been developed to cope with NP-hard problems: approximation algorithms and faster-than-…
Hardness of approximate nearest neighbor search
This work proves conditional near-quadratic running time lower bounds for approximate Bichromatic Closest Pair with Euclidean, Manhattan, Hamming, or edit distance, and construction is the first to yield new hardness results.
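The Bichromatic Closest Pair problem referenced above has an obvious quadratic brute force, which the conditional lower bound says is essentially optimal (even to approximate). A minimal sketch, assuming point sets as coordinate tuples and the Euclidean metric:

```python
import math

def bichromatic_closest_pair(A, B):
    """Naive O(|A| * |B|) bichromatic closest pair: smallest Euclidean
    distance between a point of A and a point of B. The hardness result
    above says that beating near-quadratic time significantly, even
    approximately, would refute SETH-like conjectures."""
    return min(math.dist(a, b) for a in A for b in B)
```

The same brute force works verbatim for Manhattan or Hamming distance by swapping the distance function.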
Why Walking the Dog Takes Time: Frechet Distance Has No Strongly Subquadratic Algorithms Unless SETH Fails
  • K. Bringmann
  • Mathematics, Computer Science
    2014 IEEE 55th Annual Symposium on Foundations of Computer Science
  • 2014
It is shown that the Fréchet distance cannot be computed in strongly subquadratic time, i.e., in time O(n^{2−δ}) for any δ > 0, even for 1.001-approximation, unless SETH fails; this means that finding faster algorithms is as hard as finding faster CNF-SAT algorithms, so the existence of a strongly subquadratic algorithm can be considered unlikely.
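The quadratic barrier in this result refers to the classical dynamic-programming algorithm. For the discrete variant, a minimal sketch of the standard O(n·m) dynamic program (function name and Euclidean metric are illustrative choices):

```python
import math

def discrete_frechet(P, Q):
    """Classical O(n * m) dynamic program for the discrete Fréchet
    distance between point sequences P and Q (Euclidean metric).
    dp[i][j] = discrete Fréchet distance of the prefixes
    P[0..i] and Q[0..j]."""
    n, m = len(P), len(Q)
    dp = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            # Best achievable bottleneck over the allowed predecessor cells.
            if i > 0 and j > 0:
                best_prev = min(dp[i-1][j], dp[i][j-1], dp[i-1][j-1])
            elif i > 0:
                best_prev = dp[i-1][j]
            elif j > 0:
                best_prev = dp[i][j-1]
            else:
                best_prev = 0.0
            dp[i][j] = max(best_prev, math.dist(P[i], Q[j]))
    return dp[n-1][m-1]
```

The lower bound above says that, conditionally, no algorithm improves on this quadratic behavior by a polynomial factor.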
SETH Says: Weak Fréchet Distance is Faster, but only if it is Continuous and in One Dimension
It is shown by reduction from the Orthogonal Vectors problem that algorithms with strongly subquadratic running time cannot approximate the Fréchet distance between curves better than a factor of 3 unless SETH fails, and an exact algorithm is provided to compute the weak Fréchet distance in linear time.
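The Orthogonal Vectors problem used as the reduction source above can be stated concretely; a minimal brute-force Python sketch of the quadratic baseline that fine-grained reductions transfer hardness from:

```python
def orthogonal_vectors(A, B):
    """Brute-force Orthogonal Vectors: given two sets of n Boolean
    vectors of dimension d, decide whether some a in A and b in B are
    orthogonal (share no coordinate where both are 1). Runs in
    O(n^2 * d); under SETH, no O(n^{2-eps} * poly(d)) algorithm is
    expected for d = omega(log n)."""
    return any(
        all(x * y == 0 for x, y in zip(a, b))
        for a in A for b in B
    )
```

A fine-grained reduction turns any strongly subquadratic (approximate) Fréchet-distance algorithm into an equally fast Orthogonal Vectors algorithm, which would refute SETH.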
Tight Hardness Results for Maximum Weight Rectangles
This paper affirmatively answers the conjecture that the known runtime for the Weighted Depth problem is tight up to subpolynomial factors, by providing a matching conditional lower bound; it also provides conditional lower bounds for the special case when points are arranged in a grid.
Approximability of the Discrete Fréchet Distance
This paper designs an α-approximation algorithm that runs in time O(n log n + n²/α), for any α ∈ [1, n], the first such algorithm; it also analyzes the approximation ratio of a simple, linear-time greedy algorithm, showing it to be 2^Θ(n).
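A linear-time greedy of the flavor analyzed above can be sketched as follows: repeatedly take the pointer move that minimizes the current pairwise distance and report the worst distance seen. This is a hedged sketch of one natural greedy rule, not necessarily the paper's exact algorithm; its approximation ratio can be exponentially bad in the worst case.

```python
import math

def greedy_frechet(P, Q):
    """Linear-time greedy upper bound on the discrete Fréchet distance:
    from (i, j), move to whichever of (i+1, j), (i, j+1), (i+1, j+1)
    has the smallest point-to-point distance, tracking the maximum
    distance encountered along the traversal."""
    i, j = 0, 0
    worst = math.dist(P[0], Q[0])
    while i < len(P) - 1 or j < len(Q) - 1:
        moves = []
        if i < len(P) - 1 and j < len(Q) - 1:
            moves.append((math.dist(P[i+1], Q[j+1]), i + 1, j + 1))
        if i < len(P) - 1:
            moves.append((math.dist(P[i+1], Q[j]), i + 1, j))
        if j < len(Q) - 1:
            moves.append((math.dist(P[i], Q[j+1]), i, j + 1))
        dmin, i, j = min(moves)  # greedily take the cheapest move
        worst = max(worst, dmin)
    return worst
```

Because every step advances at least one pointer, the loop runs at most len(P) + len(Q) times, giving linear time.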