Corpus ID: 13961213

Learning Selective Sum-Product Networks

@inproceedings{Peharz2014LearningSS,
  title={Learning Selective Sum-Product Networks},
  author={Robert Peharz and Robert Gens and Pedro M. Domingos},
  year={2014}
}
We consider the selectivity constraint on the structure of sum-product networks (SPNs), which allows each sum node to have at most one child with non-zero output for each possible input. This allows us to find globally optimal maximum likelihood parameters in closed form. Although they form a constrained class of SPNs, these models still strictly generalize classical graphical models such as Bayesian networks. Closed-form parameter estimation opens the door for structure learning using a principled…
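To make the closed-form result concrete, the short sketch below illustrates it under a toy assumption (not taken from the paper): when selectivity holds, every complete training sample activates exactly one child of a sum node, so the maximum-likelihood weight of a child is simply its relative selection frequency among the samples reaching that node. The function name ml_weights and the gating of children by an observed binary variable X1 are illustrative choices, not the authors' notation.

from collections import Counter

# Illustrative sketch only: under selectivity, each complete sample selects
# exactly one child of a sum node, so the maximum-likelihood weight of child c
# is w_c = (#samples selecting c) / (#samples reaching the sum node).

def ml_weights(selected_children):
    """selected_children: index of the active (non-zero) child for each sample."""
    counts = Counter(selected_children)
    n = len(selected_children)
    return {child: count / n for child, count in counts.items()}

# Hypothetical sum node whose two children are gated by a binary variable X1,
# so the observed value of X1 identifies the selected child for each sample.
x1_values = [0, 1, 1, 0, 1, 1, 1, 0]
print(ml_weights(x1_values))   # {0: 0.375, 1: 0.625}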
Citations

Structure Inference in Sum-Product Networks using Infinite Sum-Product Trees
Sum-Product Networks (SPNs) are a highly efficient type of deep probabilistic model that allows exact inference in time linear in the size of the network; a minimal bottom-up evaluation sketch illustrating this appears after this list of citations. In previous work, several heuristic…
Bayesian Learning of Sum-Product Networks
A well-principled Bayesian framework for SPN structure learning is proposed, which consistently and robustly learns SPN structures under missing data, together with a natural parametrisation for an important and widely used special case of SPNs.
Towards Scalable and Robust Sum-Product Networks
This work proposes the addition of caches to the SPN nodes and shows how this memoisation technique reduces inference times in a range of experiments, and introduces class-selective SPNs, an architecture that is suited for classification tasks and enables efficient robustness computation in credal SPNs.
Parameter and Structure Learning Techniques for Sum Product Networks
A new Bayesian moment matching (BMM) algorithm for learning SPN parameters generatively and a discriminative learning algorithm based on the Extended Baum-Welch (EBW) algorithm are presented.
Learning the Structure of Sum-Product Networks via an SVD-based Algorithm
Two new structure learning algorithms for sum-product networks, in the generative and discriminative settings, are presented; both are based on recursively extracting rank-one submatrices from data.
On the Sample Complexity of Learning Sum-Product Networks
This work shows that the sample complexity of learning tree-structured SPNs with the usual type of leaves grows at most linearly (up to logarithmic factors) with the number of parameters of the SPN, and obtains the upper bounds based on the recently proposed notion of distribution compression schemes.
Robust Analysis of MAP Inference in Selective Sum-Product Networks
This work addresses the problem of assessing the robustness of MAP inferences produced with Selective SPNs to global perturbations of the parameters, and presents efficient algorithms and an empirical analysis with realistic problems.
Cascading Sum-Product Networks using Robustness
This work builds a hierarchical approach where the classification task is deferred to another model if the outcome is deemed unreliable, and experiments show that the robustness measure offers a meaningful way to improve classification accuracy.
Sum-product networks: A survey
A survey of SPNs is offered, covering their definition, the main algorithms for inference and learning from data, several applications, a brief review of software libraries, and a comparison with related models.
Robustifying sum-product networks
The experiments show that the use of credal sum-product networks allows us to distinguish between reliable and unreliable classifications with higher accuracy than standard approaches based on (precise) probability values.
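As noted in the entry "Structure Inference in Sum-Product Networks using Infinite Sum-Product Trees" above, exact SPN inference runs in time linear in the size of the network. The sketch below illustrates why: evaluation is a single bottom-up pass that visits every node once. The dictionary-based node encoding and the evaluate function are assumptions made for compactness and are not code from any of the cited papers.

import math

# Illustrative sketch: an SPN is evaluated bottom-up, visiting each node once
# (the cache handles shared children in a DAG), so the cost is linear in the
# number of edges. Leaves are indicators, product nodes multiply child values,
# and sum nodes take weighted sums of child values.

def evaluate(node, assignment, cache=None):
    """Evaluate an SPN node for a complete assignment {variable: value}."""
    cache = {} if cache is None else cache
    if id(node) in cache:
        return cache[id(node)]
    kind = node["type"]
    if kind == "leaf":                      # indicator leaf: 1 if var == value
        value = 1.0 if assignment[node["var"]] == node["value"] else 0.0
    elif kind == "product":
        value = math.prod(evaluate(c, assignment, cache) for c in node["children"])
    else:                                   # sum node: weighted mixture of children
        value = sum(w * evaluate(c, assignment, cache)
                    for w, c in zip(node["weights"], node["children"]))
    cache[id(node)] = value
    return value

def leaf(var, val):
    return {"type": "leaf", "var": var, "value": val}

# Toy distribution: P(X1, X2) = 0.4 * [X1=0][X2=0] + 0.6 * [X1=1][X2=1]
p0 = {"type": "product", "children": [leaf("X1", 0), leaf("X2", 0)]}
p1 = {"type": "product", "children": [leaf("X1", 1), leaf("X2", 1)]}
root = {"type": "sum", "weights": [0.4, 0.6], "children": [p0, p1]}
print(evaluate(root, {"X1": 1, "X2": 1}))   # 0.6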

References

Showing 1-10 of 20 references.
Learning the Structure of Sum-Product Networks
This work proposes the first algorithm for learning the structure of SPNs that takes full advantage of their expressiveness, and shows that the learned SPNs are typically comparable to graphical models in likelihood but superior in inference speed and accuracy.
Greedy Part-Wise Learning of Sum-Product Networks
This paper introduces a novel algorithm, learning the structure and parameters of sum-product networks in a greedy bottom-up manner, that iteratively merges probabilistic models of small variable scope into larger and more complex models.
Learning Sum-Product Networks with Direct and Indirect Variable Interactions
ID-SPN is presented, a new algorithm for learning SPN structure that unifies direct and indirect variable interactions, leading to significantly better accuracy than several state-of-the-art algorithms for learning SPNs and other tractable models.
Sum-product networks: A new deep architecture
The key limiting factor in graphical model inference and learning is the complexity of the partition function. We thus ask the question: what are the most general conditions under which the partition function is tractable?
Online Incremental Structure Learning of Sum-Product Networks
A new online incremental structure learning method for SPNs is proposed that achieves the performance of batch structure learning; the experimental results show that the proposed method outperforms the online version of the previous method.
Learning the Architecture of Sum-Product Networks Using Clustering on Variables
Experimental evidence shows that learning the SPN architecture significantly improves its performance compared to using a previously-proposed static architecture.
Learning Arithmetic Circuits
This work learns arithmetic circuits (in which the cost of inference is linear) with a penalty on the number of edges in the circuit, which is equivalent to learning a Bayesian network with context-specific independence by greedily splitting conditional distributions.
A Bayesian Approach to Learning Bayesian Networks with Local Structure
A Bayesian approach to learning Bayesian networks that contain the more general decision-graph representations of the CPDs is investigated, and it is described how to evaluate the posterior probability (that is, the Bayesian score) of such a network given a database of observed cases.
Context-Specific Independence in Bayesian Networks
This paper proposes a formal notion of context-specific independence (CSI), based on regularities in the conditional probability tables (CPTs) at a node, and proposes a technique, analogous to (and based on) d-separation, for determining when such independence holds in a given network.
A differential approach to inference in Bayesian networks
The proposed framework for inference subsumes one of the most influential methods for inference in Bayesian networks, known as the tree-clustering or jointree method; this provides a deeper understanding of the classical method and lifts its desirable characteristics to a much more general setting.