Corpus ID: 232075801

Beyond Perturbation Stability: LP Recovery Guarantees for MAP Inference on Noisy Stable Instances

@inproceedings{Lang2021BeyondPS,
  title={Beyond Perturbation Stability: LP Recovery Guarantees for MAP Inference on Noisy Stable Instances},
  author={Hunter Lang and A. Reddy and D. Sontag and Aravindan Vijayaraghavan},
  booktitle={AISTATS},
  year={2021}
}
Several works have shown that perturbation stable instances of the MAP inference problem in Potts models can be solved exactly using a natural linear programming (LP) relaxation. However, most of these works give few (or no) guarantees for the LP solutions on instances that do not satisfy the relatively strict perturbation stability definitions. In this work, we go beyond these stability results by showing that the LP approximately recovers the MAP solution of a stable instance even after the instance is corrupted by noise.
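For context, here is a sketch of the optimization problem the abstract refers to, written in standard notation; the symbols below (theta, w, mu, and the local polytope L_G) are generic and not taken from the paper itself. MAP inference in a Potts model on a graph $G = (V, E)$ with $k$ labels seeks a labeling $x \in \{1, \dots, k\}^V$ minimizing the energy

\[
E(x) \;=\; \sum_{u \in V} \theta_u(x_u) \;+\; \sum_{(u,v) \in E} w_{uv}\, \mathbf{1}[x_u \neq x_v], \qquad w_{uv} \ge 0,
\]

and the natural LP relaxation is the local (pairwise) polytope LP

\[
\min_{\mu \in \mathbb{L}_G} \;\; \sum_{u \in V} \sum_{i=1}^{k} \theta_u(i)\,\mu_u(i) \;+\; \sum_{(u,v) \in E} w_{uv} \sum_{i \neq j} \mu_{uv}(i,j),
\]

where
\[
\mathbb{L}_G = \Big\{ \mu \ge 0 \;:\; \sum_i \mu_u(i) = 1 \;\;\forall u, \quad \sum_j \mu_{uv}(i,j) = \mu_u(i), \;\; \sum_i \mu_{uv}(i,j) = \mu_v(j) \;\;\forall (u,v) \in E \Big\}.
\]

Perturbation stability, roughly, requires the MAP labeling to remain the unique optimum under bounded multiplicative perturbations of the pairwise weights $w_{uv}$; the paper's guarantee concerns approximate recovery by this LP after such a stable instance is corrupted by noise.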

