In a Bayesian framework, we give a principled account of how domain-specific prior knowledge, such as imperfect analytic domain theories, can be optimally incorporated into networks of locally-tuned units: by choosing a specific architecture and by applying a specific training regimen. Our method proved successful in overcoming the data deficiency problem in …
We present a systematic approach to mean-field theory (MFT) in a general probabilistic setting without assuming a particular model. The mean-field equations derived here may serve as a local, and thus very simple, method for approximate inference in probabilistic models such as Boltzmann machines or Bayesian networks. Our approach is 'model-independent' in …
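The abstract does not reproduce the equations themselves, but for the special case of a Boltzmann machine with ±1 units the standard naive mean-field fixed point, m_i = tanh(Σ_j W_ij m_j + b_i), gives the flavour of such a local update. The sketch below (with an arbitrary coupling matrix W and bias vector b of my own choosing, not taken from the paper) simply iterates it to convergence.

```python
import numpy as np

def mean_field_boltzmann(W, b, n_iter=200, tol=1e-6):
    """Naive mean-field approximation for a Boltzmann machine with
    +/-1 units and energy E(s) = -0.5 * s^T W s - b^T s.

    Iterates the standard fixed-point equations
        m_i <- tanh(sum_j W_ij m_j + b_i)
    until the magnetizations m (approximate marginal means E[s_i])
    stop changing.
    """
    m = np.zeros(len(b))                 # start from uninformative means
    for _ in range(n_iter):
        m_new = np.tanh(W @ m + b)       # purely local update: each unit sees only its neighbours
        if np.max(np.abs(m_new - m)) < tol:
            return m_new
        m = m_new
    return m

# Example: a small random symmetric coupling matrix with zero diagonal.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.3, size=(5, 5))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)
b = rng.normal(scale=0.5, size=5)
print(mean_field_boltzmann(W, b))        # approximate E[s_i] for each unit
```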
We derive solutions for the problem of missing and noisy data in nonlinear time-series prediction from a probabilistic point of view. We discuss different approximations to the solutions, in particular approximations which require either stochastic simulation or the substitution of a single estimate for the missing data. We show experimentally that commonly …
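As a rough illustration of the two families of approximations contrasted above, the sketch below uses a toy nonlinear predictor and an assumed Gaussian belief over a missing input; it compares substituting a single point estimate with averaging predictions over simulated values of the missing datum. The predictor, distribution, and numbers are illustrative assumptions, not the paper's.

```python
import numpy as np

def f(x1, x2):
    """Toy nonlinear one-step-ahead predictor x_t = f(x_{t-1}, x_{t-2})."""
    return np.tanh(1.5 * x1) - 0.3 * x2

# Suppose x_{t-1} is missing but believed to follow N(mu, sigma^2)
# (in practice this density would come from the probabilistic model itself).
mu, sigma = 0.8, 0.5
x2 = -0.2                                 # the older, observed input

# 1) Single-estimate substitution: plug the mean in for the missing value.
pred_plugin = f(mu, x2)

# 2) Stochastic simulation: average predictions over samples of the missing
#    value, approximating E[f(x_{t-1}, x_{t-2}) | observed data].
rng = np.random.default_rng(1)
samples = rng.normal(mu, sigma, size=10_000)
pred_mc = f(samples, x2).mean()

print(pred_plugin, pred_mc)               # the two differ because f is nonlinear
```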
Bayesian networks have been successfully used to model joint probabilities in many cases. When dealing with continuous variables and nonlinear relationships, neural networks can be used to model conditional densities as part of a Bayesian network. However, doing inference can then be computationally expensive. Also, information is implicitly passed …
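To make the idea concrete, here is a minimal sketch of what one such conditional might look like: a small neural network that maps a node's parent values to the mean and spread of a Gaussian density over the node, so it can serve as a single conditional distribution inside a larger Bayesian network. The class name, layer sizes, and Gaussian form are assumptions for illustration, not the paper's architecture, and the training of the weights is omitted.

```python
import numpy as np

class ConditionalGaussianNet:
    """One conditional density p(y | parents) in a Bayesian network,
    modelled by a small neural network that maps the parent values to
    the mean and log-standard-deviation of a Gaussian over y."""

    def __init__(self, n_parents, n_hidden=10, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(scale=0.1, size=(n_hidden, n_parents))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(scale=0.1, size=(2, n_hidden))   # rows: mean, log-std
        self.b2 = np.zeros(2)

    def params(self, parents):
        h = np.tanh(self.W1 @ parents + self.b1)
        mean, log_std = self.W2 @ h + self.b2
        return mean, np.exp(log_std)

    def log_density(self, y, parents):
        mean, std = self.params(parents)
        return -0.5 * ((y - mean) / std) ** 2 - np.log(std) - 0.5 * np.log(2 * np.pi)

# Example: the conditional for a node with two parent variables.
node = ConditionalGaussianNet(n_parents=2)
print(node.log_density(0.3, np.array([1.0, -0.5])))
```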