The Lovász-Softmax Loss: A Tractable Surrogate for the Optimization of the Intersection-Over-Union Measure in Neural Networks

@inproceedings{Berman2018TheLL,
  title={The Lov{\'a}sz-Softmax Loss: A Tractable Surrogate for the Optimization of the Intersection-Over-Union Measure in Neural Networks},
  author={Maxim Berman and Amal Rannen Triki and Matthew B. Blaschko},
  booktitle={CVPR},
  year={2018}
}
Figure A.1 shows segmentations obtained for binary foreground-background segmentation on Pascal VOC under different training losses, after fine-tuning a base multi-class classification network for a specific class. We see that the Lovász hinge for the Jaccard loss tends to fill gaps in the segmentation, recover small objects, and lead to a globally more sensible segmentation. Table A.1 presents detailed scores for this binary segmentation task. We notice a clear improvement of the per-image IoU by…
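As a point of reference for the loss discussed above, here is a minimal PyTorch sketch of the Lovász hinge surrogate for the binary Jaccard loss: hinge errors are sorted in decreasing order and weighted by the discrete gradient of the Jaccard loss, which corresponds to its Lovász extension. The function names (`jaccard_grad`, `lovasz_hinge`) and the flattening of all pixels into a single vector are illustrative choices here, not the paper's reference implementation.

```python
import torch
import torch.nn.functional as F


def jaccard_grad(gt_sorted: torch.Tensor) -> torch.Tensor:
    """Weights of the Lovász extension of the Jaccard loss,
    given ground-truth labels sorted by decreasing hinge error."""
    gts = gt_sorted.sum()
    intersection = gts - gt_sorted.cumsum(0)
    union = gts + (1.0 - gt_sorted).cumsum(0)
    jaccard = 1.0 - intersection / union
    if gt_sorted.numel() > 1:
        # differences of the cumulative Jaccard losses
        jaccard[1:] = jaccard[1:] - jaccard[:-1]
    return jaccard


def lovasz_hinge(logits: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    """Lovász hinge surrogate of the binary Jaccard loss.

    logits: raw (unnormalized) scores, any shape.
    labels: same shape, values in {0, 1}.
    """
    logits = logits.reshape(-1)
    labels = labels.reshape(-1).float()
    signs = 2.0 * labels - 1.0                 # map {0, 1} -> {-1, +1}
    errors = 1.0 - logits * signs              # per-pixel hinge margins
    errors_sorted, perm = torch.sort(errors, descending=True)
    grad = jaccard_grad(labels[perm])          # interpolation weights
    return torch.dot(F.relu(errors_sorted), grad)


# Example: a single 64x64 prediction map and its binary ground truth.
scores = torch.randn(64, 64, requires_grad=True)
target = (torch.rand(64, 64) > 0.5).long()
loss = lovasz_hinge(scores, target)
loss.backward()
```

In this sketch the loss is computed over all pixels at once; computing it per image and averaging, as in the per-image IoU evaluation mentioned above, is an equally valid choice.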
