"Why Should I Trust You?": Explaining the Predictions of Any Classifier

@inproceedings{Ribeiro2016WhySI,
  title={"Why Should I Trust You?": Explaining the Predictions of Any Classifier},
  author={Marco Tulio Ribeiro and Sameer Singh and Carlos Guestrin},
  booktitle={Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining},
  year={2016}
}
Despite widespread adoption, machine learning models remain mostly black boxes. [...] We also propose a method to explain models by presenting representative individual predictions and their explanations in a non-redundant way, framing the task as a submodular optimization problem. We demonstrate the flexibility of these methods by explaining different models for text (e.g., random forests) and image classification (e.g., neural networks). We show the utility of explanations via novel experiments, both [...]
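The abstract's two ideas can be made concrete with a short sketch: (1) explain one prediction by fitting a simple model that is locally faithful to the black box around that instance, and (2) pick a small, non-redundant set of such explained instances via a greedy coverage criterion, in the spirit of the submodular formulation mentioned above. The Python below is a hedged illustration under stated assumptions, not the paper's exact algorithm: the perturbation scheme (randomly zeroing interpretable features), the exponential proximity kernel, the Ridge surrogate, the coverage objective, and all names and parameters (explain_instance, submodular_pick, num_samples, K, kernel_width, budget) are illustrative choices.

import numpy as np
from sklearn.linear_model import Ridge

def explain_instance(predict_fn, x, num_samples=5000, K=5, kernel_width=0.75):
    # Sketch of a local surrogate explanation: perturb the instance, query the
    # black box, weight perturbations by proximity, and fit a weighted linear model.
    d = x.shape[0]
    mask = np.random.randint(0, 2, size=(num_samples, d))    # which features stay "on" (binary/interpretable features assumed)
    preds = predict_fn(mask * x)                              # black-box predictions on the perturbed neighborhood
    distances = np.sqrt((1 - mask).sum(axis=1))               # distance grows with the number of features switched off
    weights = np.exp(-(distances ** 2) / kernel_width ** 2)   # exponential proximity kernel (illustrative choice)
    surrogate = Ridge(alpha=1.0)
    surrogate.fit(mask, preds, sample_weight=weights)         # locally weighted linear surrogate
    top_k = np.argsort(np.abs(surrogate.coef_))[-K:][::-1]    # keep the K most influential interpretable features
    return [(int(i), float(surrogate.coef_[i])) for i in top_k]

def submodular_pick(explanations, importances, budget=3):
    # Greedy reading of the "representative, non-redundant explanations" step:
    # repeatedly add the instance whose explanation covers the most important
    # features not yet covered. Weighted coverage is monotone submodular, so
    # greedy selection carries the usual 1 - 1/e approximation guarantee.
    covered, chosen = set(), []
    for _ in range(budget):
        best, best_gain = None, 0.0
        for i, feats in enumerate(explanations):
            if i in chosen:
                continue
            gain = sum(importances[f] for f in feats if f not in covered)
            if best is None or gain > best_gain:
                best, best_gain = i, gain
        if best is None:
            break
        chosen.append(best)
        covered.update(explanations[best])
    return chosen

# Hypothetical usage: a black box that only looks at feature 0, explained at an all-ones instance.
# explain_instance(lambda Z: Z[:, 0].astype(float), np.ones(10))

The two functions compose naturally: run explain_instance over a sample of instances, accumulate per-feature importances (for example, summed absolute surrogate weights), and hand both to submodular_pick to obtain a small set of representative, non-overlapping explanations to show a user.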
4,213 Citations
  • Can I Trust the Explainer? Verifying Post-hoc Explanatory Methods (12 citations, Highly Influenced)
  • How Much Can I Trust You? - Quantifying Uncertainties in Explaining Neural Networks (3 citations)
  • Minimalistic Explanations: Capturing the Essence of Decisions (3 citations, Highly Influenced)
  • Right for the Right Reasons: Training Differentiable Models by Constraining their Explanations (202 citations)
  • A Unified Approach to Interpreting Model Predictions (2,230 citations, Highly Influenced)
  • Explaining the Predictions of Any Image Classifier via Decision Trees (2 citations)
  • Why X rather than Y? Explaining Neural Models' Predictions by Generating Intervention Counterfactual Samples (Highly Influenced)
  • Why Should I Trust This Item? Explaining the Recommendations of any Model (1 citation)
