Publications
Inference and learning in probabilistic logic programs using weighted Boolean formulas
TLDR: This paper investigates how classical inference and learning tasks known from the graphical-model community can be tackled for probabilistic logic programs in which some of the facts are annotated with probabilities.
  • Citations: 218 · Influence: 28 · PDF
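The reduction described in this paper's summary can be illustrated with a toy sketch: once a program's probabilistic facts and a query are encoded as a weighted Boolean formula, the query probability is a weighted model count. The sketch below uses naive enumeration and a hypothetical burglary/earthquake program; the paper itself relies on far more efficient techniques, so this only shows the semantics, not the algorithm.

```python
from itertools import product

# Hypothetical probabilistic facts: name -> probability of being true.
facts = {"burglary": 0.1, "earthquake": 0.2}

def alarm(world):
    # Encodes the clauses: alarm :- burglary.  alarm :- earthquake.
    return world["burglary"] or world["earthquake"]

def query_probability(query, facts):
    """Weighted model counting by enumeration: sum the weights of all
    truth assignments (worlds) in which the query formula holds."""
    names = list(facts)
    total = 0.0
    for values in product([True, False], repeat=len(names)):
        world = dict(zip(names, values))
        weight = 1.0
        for name in names:
            weight *= facts[name] if world[name] else 1.0 - facts[name]
        if query(world):
            total += weight
    return total

print(query_probability(alarm, facts))  # ≈ 0.28, i.e. 1 - 0.9 * 0.8
```

Enumeration is exponential in the number of facts; the point of the weighted-formula encoding is that specialized model counters can do much better.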
Lifted Variable Elimination: Decoupling the Operators from the Constraint Language
TLDR: We define operators for lifted inference in terms of relational-algebra operators, so that they operate on the semantic level (the constraints' extension) rather than on the syntactic level, making them language-independent.
  • Citations: 42 · Influence: 7 · PDF
Inference in Probabilistic Logic Programs using Weighted CNF's
TLDR: We improve on the state of the art in probabilistic logic programming by developing efficient algorithms for the MARG and MAP inference tasks.
  • Citations: 80 · Influence: 6 · PDF
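For contrast with marginal inference, MAP asks for the single most probable truth assignment consistent with the evidence. The sketch below again uses naive enumeration over a hypothetical two-fact program (not the paper's weighted-CNF machinery) purely to pin down what the MAP task computes.

```python
from itertools import product

# Hypothetical probabilistic facts: name -> probability of being true.
facts = {"burglary": 0.1, "earthquake": 0.2}

def consistent(world):
    # Evidence: the alarm went off (alarm :- burglary.  alarm :- earthquake.)
    return world["burglary"] or world["earthquake"]

def map_state(facts, consistent):
    """Return the most probable world satisfying the evidence, with its weight."""
    names = list(facts)
    best, best_weight = None, -1.0
    for values in product([True, False], repeat=len(names)):
        world = dict(zip(names, values))
        if not consistent(world):
            continue
        weight = 1.0
        for name in names:
            weight *= facts[name] if world[name] else 1.0 - facts[name]
        if weight > best_weight:
            best, best_weight = world, weight
    return best, best_weight

state, weight = map_state(facts, consistent)
print(state)  # the earthquake-only world, weight ≈ 0.18
```

Here the MAP state is "earthquake but no burglary" (weight 0.9 × 0.2 ≈ 0.18), which beats both the burglary-only world (≈ 0.08) and the both-events world (≈ 0.02).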
Logical Bayesian Networks and Their Relation to Other Probabilistic Logical Models
TLDR: We introduce Logical Bayesian Networks (LBNs) as another language for knowledge-based model construction of Bayesian networks, besides existing languages such as Probabilistic Relational Models and Bayesian Logic Programs.
  • Citations: 74 · Influence: 6 · PDF
Predictive data mining in intensive care
TLDR: In this paper we describe an application of data mining methods to an intensive care unit (ICU) database for several prediction tasks.
  • Citations: 34 · Influence: 5 · PDF
Mining data from intensive care patients
TLDR: In this paper we describe the application of data mining methods for predicting the evolution of patients in an intensive care unit.
  • Citations: 86 · Influence: 4 · PDF
Towards digesting the alphabet-soup of statistical relational learning
TLDR: This paper reports on our work towards the development of a probabilistic logic programming environment intended as a target language into which other probabilistic languages can be compiled, thereby contributing to the digestion of the "alphabet soup".
  • Citations: 45 · Influence: 4 · PDF
Instance-level accuracy versus bag-level accuracy in multi-instance learning
TLDR: In multi-instance learning, instances are organized into bags, and a bag is labeled positive if it contains at least one positive instance, and negative otherwise; the labels of individual instances are not given.
  • Citations: 26 · Influence: 2 · PDF
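The bag-labeling rule stated in this summary is small enough to write down directly. The sketch below (with hypothetical bag data) makes the multi-instance assumption explicit: a bag is positive exactly when some instance in it is positive.

```python
def bag_label(instance_labels):
    """Multi-instance rule: a bag is positive iff it contains
    at least one positive instance."""
    return any(instance_labels)

# Hypothetical bags of (hidden) instance labels.
bags = {
    "bag1": [False, True, False],
    "bag2": [False, False],
}
labels = {name: bag_label(instances) for name, instances in bags.items()}
print(labels)  # {'bag1': True, 'bag2': False}
```

Note the asymmetry this rule creates, which is exactly what the paper's instance-level versus bag-level accuracy comparison turns on: a learner can label many individual instances wrong and still get every bag right, or vice versa.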
Logical Bayesian networks
  • Citations: 18 · Influence: 2
Lifted Variable Elimination with Arbitrary Constraints
TLDR: Lifted probabilistic inference algorithms exploit regularities in the structure of graphical models to perform inference more efficiently.
  • Citations: 24 · Influence: 2 · PDF
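The regularity that lifted inference exploits can be illustrated with a toy example (this is only the core counting idea, not the lifted variable elimination operators these papers develop): when n random variables are interchangeable, inference can work with counts of how many are true instead of enumerating all 2**n individual assignments.

```python
from math import comb

def prob_exactly_k(n, k, p):
    """Probability that exactly k of n interchangeable (i.i.d.) binary
    variables with success probability p are true. A grounded approach
    would sum over 2**n worlds; counting collapses that to one term
    per count k, using the binomial coefficient for the symmetry."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# n + 1 = 101 counts stand in for 2**100 grounded assignments.
dist = [prob_exactly_k(100, k, 0.3) for k in range(101)]
print(abs(sum(dist) - 1.0) < 1e-9)  # True: the counts form a distribution
```

The same collapse from worlds to counts is what lets lifted variable elimination scale to domains where grounded inference is hopeless, provided the model's symmetries (here, full exchangeability) actually hold.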