• Publications
Most likely heteroscedastic Gaussian process regression
TLDR: This paper presents a novel Gaussian process (GP) approach to regression with input-dependent noise rates.
  • 258 citations (29 highly influential) · PDF available
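The core idea of input-dependent noise can be illustrated with a minimal sketch (this is a simplified stand-in, not the paper's actual estimator): fit a homoscedastic GP, estimate per-point noise from the squared residuals with a second GP on the log scale, then refit the mean GP with that learned noise.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between 1-D point sets a and b.
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / length_scale ** 2)

def gp_predict(x_train, y_train, x_test, noise_var):
    # GP regression mean with a scalar or per-point noise variance.
    K = rbf_kernel(x_train, x_train) + np.diag(np.broadcast_to(noise_var, x_train.shape))
    alpha = np.linalg.solve(K, y_train)
    return rbf_kernel(x_test, x_train) @ alpha

# Toy data whose noise level grows with |x|.
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 100)
y = np.sin(x) + rng.normal(0.0, 0.05 + 0.2 * np.abs(x))

# Step 1: homoscedastic fit with a fixed noise guess.
mu = gp_predict(x, y, x, noise_var=0.1)
# Step 2: smooth the log squared residuals with a second GP.
log_noise = gp_predict(x, np.log((y - mu) ** 2 + 1e-6), x, noise_var=1.0)
# Step 3: refit the mean GP with the learned input-dependent noise.
mu_het = gp_predict(x, y, x, noise_var=np.exp(log_noise))
```

In a full treatment the two steps would be iterated to convergence; the single pass here is just enough to show how the noise model and the mean model feed each other.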
Lifted Probabilistic Inference with Counting Formulas
TLDR: We present a new lifted inference algorithm, C-FOVE, that not only handles counting formulas in its input, but also creates counting formulas for use in intermediate potentials.
  • 209 citations (29 highly influential) · PDF available
Bayesian Logic Programs
TLDR: We introduce a generalization of Bayesian networks, called Bayesian logic programs, to overcome some of the limitations of propositional logic.
  • 265 citations (28 highly influential) · PDF available
Gradient-based boosting for statistical relational learning: The relational dependency network case
TLDR: We propose to turn the problem into a series of relational function-approximation problems using gradient-based boosting.
  • 124 citations (17 highly influential) · PDF available
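The functional-gradient idea behind this line of work can be sketched in a propositional setting (a minimal stand-in, not the relational learner from the paper): for squared loss, each weak learner is fit to the current residuals, which are the negative functional gradient.

```python
import numpy as np

def fit_stump(x, r):
    # Find the threshold on x that best fits residuals r with a two-leaf step.
    best = None
    for t in np.unique(x):
        left, right = r[x <= t], r[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, t, left.mean(), right.mean())
    _, t, lmean, rmean = best
    return t, lmean, rmean

def predict_stump(stump, x):
    t, lmean, rmean = stump
    return np.where(x <= t, lmean, rmean)

def boost(x, y, rounds=50, lr=0.1):
    # Each stump is fit to the residuals y - pred, i.e. the negative
    # functional gradient of the squared loss at the current model.
    pred = np.full_like(y, y.mean())
    for _ in range(rounds):
        stump = fit_stump(x, y - pred)
        pred = pred + lr * predict_stump(stump, x)
    return pred

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 200)
y = np.sin(2 * x) + rng.normal(0, 0.1, 200)
pred = boost(x, y)
```

The relational version replaces the stumps with relational regression trees, but the outer loop, fitting each new learner to residuals, is the same.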
Towards Combining Inductive Logic Programming with Bayesian Networks
TLDR: We present results on combining Inductive Logic Programming (ILP) with Bayesian networks to learn both the qualitative and quantitative components of Bayesian logic programs.
  • 154 citations (15 highly influential)
Probabilistic Inductive Logic Programming
TLDR: We outline three classical settings for inductive logic programming, namely learning from entailment, learning from interpretations, and learning from proofs or traces.
  • 256 citations (13 highly influential) · PDF available
Counting Belief Propagation
TLDR: We present a new and simple BP algorithm, called counting BP, that exploits additional symmetries not reflected in the graphical structure and hence not exploitable by efficient inference techniques.
  • 168 citations (11 highly influential) · PDF available
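The symmetry detection that makes this kind of lifted message passing pay off can be sketched as color refinement (a minimal illustration, not the paper's actual message equations): nodes that end with the same color have locally indistinguishable neighborhoods, so one message per color can stand in for one message per node.

```python
# Color refinement (1-WL) on an undirected graph given as an adjacency dict.
def refine(adj, colors, iters=None):
    iters = iters if iters is not None else len(adj)
    for _ in range(iters):
        # Signature = own color plus the sorted multiset of neighbor colors.
        sigs = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v]))) for v in adj}
        palette = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
        new = {v: palette[sigs[v]] for v in adj}
        if new == colors:  # stable coloring reached
            break
        colors = new
    return colors

# A 6-cycle with uniform potentials: every node is symmetric, one color survives.
cycle = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
uniform = refine(cycle, {i: 0 for i in range(6)})

# A 3-node path: the two endpoints are symmetric, the middle node is not.
path = {0: [1], 1: [0, 2], 2: [1]}
groups = refine(path, {i: 0 for i in range(3)})
```

On the cycle all six nodes share a color, so a lifted scheme would compute a single message; on the path the endpoints group together while the middle node stands alone.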
Bayesian Logic Programming: Theory and Tool
TLDR: In recent years, there has been significant interest in integrating probability theory with first-order logic and relational representations [see De Raedt and Kersting, 2003, for an overview].
  • 172 citations (10 highly influential) · PDF available
Propagation kernels: efficient graph kernels from propagated information
TLDR: We introduce propagation kernels, a general graph-kernel framework for efficiently measuring the similarity of structured data.
  • 96 citations (10 highly influential) · PDF available
Logical Hidden Markov Models
TLDR: Logical hidden Markov models (LOHMMs) upgrade traditional hidden Markov models to deal with sequences of structured symbols in the form of logical atoms, rather than flat characters.
  • 110 citations (9 highly influential) · PDF available