
We present a novel hybrid algorithm for Bayesian network structure learning, called Hybrid HPC (H2PC). It first reconstructs the skeleton of a Bayesian network and then performs a Bayesian-scoring greedy hill-climbing search to orient the edges. It is based on a subroutine called HPC, which combines ideas from incremental and divide-and-conquer…
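The score-based phase described above can be illustrated with a minimal sketch: greedy hill-climbing that orients the edges of a given skeleton, one acyclicity-preserving move at a time, against a scoring function. This is not the paper's H2PC implementation; the `score` callable is a hypothetical stand-in for a Bayesian score (such as BDeu) that a real implementation would evaluate against data.

```python
def is_acyclic(nodes, edges):
    # Kahn's algorithm: the graph is a DAG iff every node can be
    # removed in some topological order.
    indeg = {n: 0 for n in nodes}
    for _, v in edges:
        indeg[v] += 1
    queue = [n for n in nodes if indeg[n] == 0]
    seen = 0
    while queue:
        u = queue.pop()
        seen += 1
        for a, b in edges:
            if a == u:
                indeg[b] -= 1
                if indeg[b] == 0:
                    queue.append(b)
    return seen == len(nodes)

def hill_climb(nodes, skeleton, score):
    """Greedily orient skeleton edges, keeping the graph acyclic.

    `skeleton` is a list of undirected pairs; `score` maps a set of
    directed edges to a number (higher is better).
    """
    dag = set()
    improved = True
    while improved:
        improved = False
        best_gain, best_edge = 0.0, None
        for u, v in skeleton:
            for cand in [(u, v), (v, u)]:
                if cand in dag:
                    continue
                # Try adding this orientation (replacing the reverse if present).
                trial = (dag - {(cand[1], cand[0])}) | {cand}
                if not is_acyclic(nodes, trial):
                    continue
                gain = score(trial) - score(dag)
                if gain > best_gain:
                    best_gain, best_edge = gain, cand
        if best_edge is not None:
            dag.discard((best_edge[1], best_edge[0]))
            dag.add(best_edge)
            improved = True
    return dag
```

With a toy score that rewards edges pointing into one variable, the search orients the skeleton `a–c, b–c` as `a→c, b→c`; constraining moves to the skeleton is what distinguishes this hybrid setup from an unrestricted greedy search over all edges.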

The benefit of exploiting label dependence in multi-label classification is known to be closely dependent on the type of loss to be minimized. In this paper, we show that the subsets of labels that appear as irreducible factors in the factorization of the conditional distribution of the label set given the input features play a pivotal role for multi-label…

We present a novel hybrid algorithm for Bayesian network structure learning, called H2PC. It first reconstructs the skeleton of a Bayesian network and then performs a Bayesian-scoring greedy hill-climbing search to orient the edges. The algorithm is based on divide-and-conquer constraint-based subroutines to learn the local structure around a target…

We discuss a method to improve the exact F-measure maximization algorithm called GFM, proposed in [1] for multi-label classification, assuming the label set can be partitioned into conditionally independent subsets given the input features. If the labels were all independent, the estimation of only m parameters (m denoting the number of labels) would…
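To make the objective concrete, here is a brute-force sketch of expected-F1 maximization under the fully independent case mentioned above. It enumerates all 2^m label sets, so it only illustrates the quantity that GFM optimizes efficiently; it is not the GFM algorithm, and the marginals are hypothetical inputs.

```python
from itertools import product

def brute_force_f_optimal(marginals):
    """Return the prediction maximizing E[F1], assuming independent labels.

    `marginals` are hypothetical P(Y_i = 1); under independence the joint
    over label configurations factorizes as a product of Bernoullis.
    """
    m = len(marginals)
    configs = list(product([0, 1], repeat=m))

    # Joint probability of each of the 2^m label configurations.
    joint = {}
    for y in configs:
        p = 1.0
        for yi, pi in zip(y, marginals):
            p *= pi if yi else (1.0 - pi)
        joint[y] = p

    def expected_f(h):
        # E[F1(h, Y)] with the convention F1 = 1 when both sets are empty.
        ef = 0.0
        for y, p in joint.items():
            tp = sum(a & b for a, b in zip(h, y))
            denom = sum(h) + sum(y)
            ef += p * (2.0 * tp / denom if denom else 1.0)
        return ef

    return max(configs, key=expected_f)
```

For instance, with marginals `[0.9, 0.1]` the E[F1]-optimal prediction is `(1, 0)`: F-measure is not decomposable over labels, so simply thresholding each marginal at 0.5 is not optimal in general, which is why dedicated algorithms such as GFM exist.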

We study the problem of decomposing a multivariate probability distribution p(v) defined over a set of random variables V = {V1, ..., Vn} into a product of factors defined over disjoint subsets {V_F1, ..., V_Fm}. We show that the decomposition of V into irreducible disjoint factors forms a unique partition, which corresponds to the connected…
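Following the connected-components characterization in the abstract, the partition itself is cheap to compute once pairwise dependencies are known. The sketch below uses union-find to merge variables joined by a (hypothetically supplied) dependence relation; how those dependencies are tested from data is the substance of the paper, not shown here.

```python
def irreducible_factors(variables, dependent_pairs):
    """Partition `variables` into the connected components of the
    dependency graph given by `dependent_pairs` (union-find)."""
    parent = {v: v for v in variables}

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v

    for a, b in dependent_pairs:
        parent[find(a)] = find(b)

    factors = {}
    for v in variables:
        factors.setdefault(find(v), set()).add(v)
    # Sort for a deterministic, readable result.
    return sorted(factors.values(), key=lambda s: sorted(s))
```

For example, with variables `V1..V4` and dependencies `(V1, V2)` and `(V3, V4)`, the unique partition into irreducible factors is `{V1, V2}` and `{V3, V4}`, matching the factorization p(v) = p(v1, v2) p(v3, v4).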

Causal Bayesian networks are a special class of Bayesian networks in which the hierarchy directly encodes the causal relationships between the variables. This makes it possible to compute the effect of interventions, which are external changes to the system caused by, e.g., gene knockouts or an administered drug. Whereas numerous packages for constructing…

We explore a practical approach to learn a plausible causal Bayesian network from a combination of non-experimental data and qualitative assumptions that are deemed likely by health experts. The method is based on the incorporation of prior expert knowledge, in the form of partial pairwise ordering constraints between variables, into a recent constraint-based…
