Many problems require repeated inference on probabilistic graphical models, with different values for evidence variables or other changes. Examples of such problems include utility maximization, MAP inference, online and interactive inference, parameter and structure learning, and dynamic inference. Since small changes to the evidence typically only affect …
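To make the reuse concrete: on a chain-structured model, forward messages computed before the point where the evidence changed remain valid, so only the suffix of the message pass must be redone. The sketch below is a generic illustration of this idea under simplifying assumptions, not the algorithm proposed in the paper; the transition matrix, emission vectors, and chain structure are all invented for the example.

import numpy as np

def forward_messages(transition, emissions):
    # Unnormalized forward pass; emissions[t] is a length-K evidence vector.
    alphas = [emissions[0]]
    for e in emissions[1:]:
        alphas.append((alphas[-1] @ transition) * e)
    return alphas

K, T = 3, 100
rng = np.random.default_rng(0)
transition = rng.random((K, K))
emissions = [rng.random(K) for _ in range(T)]

alphas = forward_messages(transition, emissions)

# Evidence at position t changes: alphas[0..t-1] are reusable as-is,
# so only alphas[t..T-1] need to be recomputed.
t = 90
emissions[t] = rng.random(K)
for s in range(t, T):
    if s == 0:
        alphas[s] = emissions[s]
    else:
        alphas[s] = (alphas[s - 1] @ transition) * emissions[s]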
Lifting can greatly reduce the cost of inference on first-order probabilistic models, but constructing the lifted network can itself be quite costly. In addition, the minimal lifted network is often very close in size to the fully propositionalized model; lifted inference yields little or no speedup in these situations. In this paper, we address both of these …
Many AI applications need to explicitly represent relational structure as well as handle uncertainty. First-order probabilistic models combine the power of logic and probability to deal with such domains. A naive approach to inference in these models is to propositionalize the whole theory and carry out the inference on the ground network. Lifted inference …
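For intuition, here is a minimal sketch of the counting argument behind lifting, on a toy model invented for the example (n interchangeable binary ground atoms sharing one unary potential phi, not a model from the paper). The propositional approach enumerates all 2^n ground assignments; the lifted computation groups assignments by how many atoms are true, summing over only n + 1 cases.

from itertools import product
from math import comb

phi = {0: 1.0, 1: 2.5}  # shared unary potential, illustrative values
n = 16

def z_ground(phi, n):
    # Naive propositional inference: enumerate every ground assignment.
    total = 0.0
    for x in product((0, 1), repeat=n):
        w = 1.0
        for v in x:
            w *= phi[v]
        total += w
    return total

def z_lifted(phi, n):
    # Lifted inference: the atoms are interchangeable, so group
    # assignments by the count k of atoms set to 1.
    return sum(comb(n, k) * phi[1] ** k * phi[0] ** (n - k)
               for k in range(n + 1))

# Both equal (phi[0] + phi[1]) ** n by the binomial theorem.
assert abs(z_ground(phi, n) - z_lifted(phi, n)) < 1e-6 * z_lifted(phi, n)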
Intractable inference has been a major barrier to the wide adoption of statistical relational models. Existing exact methods suffer from a lack of scalability, and approximate methods tend to be unreliable. Sum-product networks (SPNs; Poon and Domingos 2011) are a recently proposed probabilistic architecture that guarantees tractable exact inference, even …
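A minimal sketch of why SPN inference is tractable, using an invented two-variable network (the structure and weights below are illustrative, not taken from Poon and Domingos): sum nodes mix children over the same variables, product nodes multiply children over disjoint variables, and any query, joint or marginal, is a single bottom-up pass, so exact inference is linear in the network size.

def leaf(var, val):
    # Indicator leaf: matches the evidence, and evaluates to 1 when
    # var is absent from the evidence (i.e., marginalized out).
    return lambda e: 1.0 if e.get(var, val) == val else 0.0

def sum_node(children, weights):
    return lambda e: sum(w * c(e) for w, c in zip(weights, children))

def product_node(children):
    def f(e):
        p = 1.0
        for c in children:
            p *= c(e)
        return p
    return f

# A small mixture over (X1, X2), built by hand.
spn = sum_node(
    [product_node([sum_node([leaf('X1', 1), leaf('X1', 0)], [0.9, 0.1]),
                   sum_node([leaf('X2', 1), leaf('X2', 0)], [0.8, 0.2])]),
     product_node([sum_node([leaf('X1', 1), leaf('X1', 0)], [0.2, 0.8]),
                   sum_node([leaf('X2', 1), leaf('X2', 0)], [0.3, 0.7])])],
    [0.6, 0.4])

print(spn({'X1': 1, 'X2': 1}))  # joint probability: 0.456
print(spn({'X1': 1}))           # marginal, X2 summed out by the same pass: 0.62
print(spn({}))                  # normalization check: 1.0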
Many first-order probabilistic models can be represented much more compactly using aggregation operations such as counting. While traditional statistical relational representations share factors across sets of interchangeable random variables, representations that explicitly model aggregations also exploit interchangeability of random variables within …
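A standard instance of such count-based aggregation (an illustrative example, not one drawn from this abstract) is noisy-or, where the child depends on its interchangeable parents only through how many of them are active, so the conditional needs O(1) parameters rather than O(2^n):

% Noisy-or over n interchangeable parents X_1, ..., X_n with shared
% activation probability p; the child sees only the count k.
\[
P(Y = 1 \mid x_1, \dots, x_n)
  = 1 - \prod_{i=1}^{n} (1 - p)^{x_i}
  = 1 - (1 - p)^{k},
  \qquad k = \sum_{i=1}^{n} x_i .
\]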
Link mining problems are characterized by high complexity (since linked objects are not statistically independent) and uncertainty (since data is noisy and incomplete). Thus they necessitate a modeling language that is both probabilistic and relational. Markov logic provides this by attaching weights to formulas in first-order logic and viewing them as …
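The truncated sentence presumably continues with the standard construction, in which the weighted formulas define a log-linear distribution over possible worlds (Richardson and Domingos 2006):

% w_i is the weight of formula F_i, n_i(x) counts the true groundings
% of F_i in world x, and Z normalizes over all worlds.
\[
P(X = x) = \frac{1}{Z} \exp\!\Big( \sum_i w_i \, n_i(x) \Big),
\qquad
Z = \sum_{x'} \exp\!\Big( \sum_i w_i \, n_i(x') \Big).
\]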