# On Efficiently Explaining Graph-Based Classifiers

```bibtex
@inproceedings{Huang2021OnEE,
  title     = {On Efficiently Explaining Graph-Based Classifiers},
  author    = {Xuanxiang Huang and Yacine Izza and Alexey Ignatiev and Joao Marques-Silva},
  booktitle = {KR},
  year      = {2021}
}
```

Recent work has not only shown that decision trees (DTs) may not be interpretable, but also proposed a polynomial-time algorithm for computing one PI-explanation of a DT.
This paper shows that polynomial-time algorithms for computing one PI-explanation exist for a wide range of classifiers, collectively referred to as decision graphs, which include decision trees and binary decision diagrams as well as their multi-valued variants. In addition, the paper also proposes a polynomial-time…
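To make the notion of a PI-explanation concrete, the sketch below computes one subset-minimal sufficient reason for a toy Boolean decision tree using a generic deletion-based procedure: start from all features fixed to the instance's values, then drop each feature in turn and keep it dropped only if the prediction remains forced. Note that the sufficiency check here is brute force (exponential in the number of free features) purely for illustration; the paper's contribution is a polynomial-time algorithm that exploits the structure of decision graphs instead. The tree encoding and helper names are assumptions, not from the paper.

```python
from itertools import product

# A toy decision tree over Boolean features, encoded as nested tuples:
# ("leaf", cls) or ("node", feature_index, subtree_if_0, subtree_if_1).
TREE = ("node", 0,
        ("node", 1, ("leaf", 0), ("leaf", 1)),
        ("leaf", 1))

def classify(tree, point):
    if tree[0] == "leaf":
        return tree[1]
    _, feat, left, right = tree
    return classify(right if point[feat] else left, point)

def is_sufficient(tree, point, fixed, n_features):
    """True if fixing only the features in `fixed` forces the prediction.
    Brute force over all completions of the free features (illustration only)."""
    target = classify(tree, point)
    free = [i for i in range(n_features) if i not in fixed]
    for values in product([0, 1], repeat=len(free)):
        candidate = list(point)
        for i, v in zip(free, values):
            candidate[i] = v
        if classify(tree, candidate) != target:
            return False
    return True

def one_pi_explanation(tree, point, n_features):
    """Deletion-based computation of one subset-minimal sufficient reason."""
    expl = set(range(n_features))
    for feat in range(n_features):
        expl.discard(feat)
        if not is_sufficient(tree, point, expl, n_features):
            expl.add(feat)  # feature is needed to force the prediction
    return sorted(expl)
```

For the instance (1, 0), the tree predicts class 1 regardless of feature 1, so feature 0 alone is a sufficient reason and the procedure returns `[0]`.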

## 4 Citations

On Deciding Feature Membership in Explanations of SDD & Related Classifiers

- Computer Science
- 2022

The paper proves that, for any classifier for which an explanation can be computed in polynomial time, deciding feature membership in an explanation can be done with one NP oracle call, and it proposes propositional encodings for classifiers represented with Sentential Decision Diagrams and for other related propositional languages.

On the Explanatory Power of Decision Trees

- Computer Science
- ArXiv
- 2021

It is proved that the set of all sufficient reasons of minimal size for an instance, given a decision tree, can be exponentially larger than the size of the input (the instance and the decision tree), and that generating the full set of sufficient reasons can therefore be out of reach.

On the Computation of Necessary and Sufficient Explanations

- Computer Science
- ArXiv
- 2022

This paper justifies the terminology of necessary and sufficient reasons semantically, shows that necessary reasons correspond to what is known as contrastive explanations, and provides an algorithm that can enumerate the shortest necessary reasons in output-polynomial time.

Explanations for Monotonic Classifiers

- Computer Science
- ICML
- 2021

Novel algorithms are described for computing one formal explanation of a (black-box) monotonic classifier, with run time polynomial in the run time complexity of the classifier and the number of features.

## References

Showing 1-10 of 92 references

Using the minimum description length principle to infer reduced ordered decision graphs

- Computer Science
- Machine Learning
- 2004

This work proposes a local optimization algorithm that generates compact decision graphs by performing local changes in an existing graph until a minimum is reached, and uses Rissanen's minimum description length principle to control the trade-off between accuracy on the training set and complexity of the description.

SAT-Based Rigorous Explanations for Decision Lists

- Computer Science
- SAT
- 2021

This paper shows that computing explanations for DLs is computationally hard, proposes propositional encodings for computing abductive and contrastive explanations of DLs, and investigates the practical efficiency of a MARCO-like approach for enumerating explanations.

On Explaining Random Forests with SAT

- Computer Science
- IJCAI
- 2021

The paper proposes a propositional encoding for computing explanations of RFs, thus enabling finding PI-explanations with a SAT solver, and demonstrates that the proposed SAT-based approach significantly outperforms existing heuristic approaches.

Optimizing Binary Decision Diagrams for Interpretable Machine Learning Classification

- Computer Science
- 2021 Design, Automation & Test in Europe Conference & Exhibition (DATE)
- 2021

This work makes preliminary inroads in two main directions: a SAT-based model for computing a decision tree as the smallest Reduced Ordered Binary Decision Diagram (ROBDD) consistent with given training data, and heuristic approaches for deriving sub-optimal ROBDDs in order to improve the scalability of the proposed technique.

Abduction-Based Explanations for Machine Learning Models

- Computer Science
- AAAI
- 2019

A constraint-agnostic solution is proposed for computing explanations of any ML model by exploiting abductive reasoning; it requires only that the ML model can be represented as a set of constraints in some target constraint reasoning system whose decision problem can be answered with an oracle.

Bayesian Networks and Decision Graphs

- Computer Science
- Statistics for Engineering and Information Science
- 2001

The book introduces probabilistic graphical models and decision graphs, including Bayesian networks and influence diagrams, and presents a thorough introduction to state-of-the-art solution and analysis algorithms.

A Symbolic Approach to Explaining Bayesian Network Classifiers

- Computer Science
- IJCAI
- 2018

We propose an approach for explaining Bayesian network classifiers, which is based on compiling such classifiers into decision functions that have a tractable and symbolic form. We introduce two…

Optimal classification trees

- Computer Science
- Machine Learning
- 2017

Optimal classification trees are presented: a novel formulation of the decision tree problem using modern MIO techniques that yields the optimal decision tree for axis-aligned splits. Synthetic tests demonstrate that these methods recover the true decision tree more closely than heuristics, refuting the notion that optimal methods overfit the training data.

On Tractable XAI Queries based on Compiled Representations

- Computer Science
- KR
- 2020

This paper defines new explanation and/or verification queries about classifiers and shows how they can be addressed by combining queries and transformations about the associated Boolean circuits.

Compiling Bayesian Network Classifiers into Decision Graphs

- Computer Science
- AAAI
- 2019

An algorithm is proposed for compiling Bayesian network classifiers into decision graphs that mimic the input and output behavior of the classifiers, which are tractable and can be exponentially smaller in size than decision trees.