We study Bayesian networks for continuous variables using non-linear conditional density estimators. We demonstrate that useful structures can be extracted from a data set in a self-organized way and we present sampling techniques for belief update based on Markov blanket conditional density models.
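The kind of Markov-blanket sampling mentioned above can be sketched as follows; this is a minimal illustration with linear-Gaussian conditionals standing in for the paper's neural conditional density models, and the chain structure, parameters, and evidence value are all illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Gibbs-style belief update in a tiny continuous Bayesian network
# X1 -> X2 -> X3, where each conditional is Gaussian with unit variance
# (a stand-in for neural conditional density models). We observe X3 and
# repeatedly resample each hidden variable from its distribution given
# its Markov blanket.
rng = np.random.default_rng(0)
x3 = 2.0                       # observed evidence
x1, x2 = 0.0, 0.0              # initial state of the hidden variables
samples = []
for _ in range(20_000):
    # p(x1 | x2) for X1 ~ N(0, 1), X2 ~ N(x1, 1): mean x2/2, variance 1/2
    x1 = rng.normal(x2 / 2.0, np.sqrt(0.5))
    # p(x2 | x1, x3) for X2 ~ N(x1, 1), X3 ~ N(x2, 1): mean (x1+x3)/2
    x2 = rng.normal((x1 + x3) / 2.0, np.sqrt(0.5))
    samples.append((x1, x2))

post = np.mean(samples, axis=0)  # approximate posterior means of X1, X2
```

For this linear-Gaussian chain the exact posterior means given X3 = 2 are 2/3 and 4/3, so the sampler's output can be checked directly; with nonlinear conditional models no such closed form exists, which is where sampling becomes genuinely necessary.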

In this paper we address the problem of learning the structure in nonlinear Markov networks with continuous variables. Markov networks are well suited to model relationships which do not exhibit a natural causal ordering. We use neural network structures to model the quantitative relationships between variables. Using two data sets we show that interesting… (More)

In a Bayesian framework, we give a principled account of how domain-specific prior knowledge such as imperfect analytic domain theories can be optimally incorporated into networks of locally-tuned units: by choosing a specific architecture and by applying a specific training regimen. Our method proved successful in overcoming the data deficiency problem in… (More)

We present a systematic approach to mean-field theory (MFT) in a general probabilistic setting without assuming a particular model. The mean-field equations derived here may serve as a local, and thus very simple, method for approximate inference in probabilistic models such as Boltzmann machines or Bayesian networks. Our approach is 'model-independent' in… (More)
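For the Boltzmann machine case mentioned in the abstract, the classical mean-field equations take the form m_i = tanh(b_i + Σ_j W_ij m_j), iterated to a fixed point. A minimal sketch (the weights, biases, and damping schedule are illustrative assumptions):

```python
import numpy as np

# Mean-field approximation for a small Boltzmann machine with +/-1 units.
# W is symmetric with zero diagonal; b holds the unit biases. The update
# m_i = tanh(b_i + sum_j W_ij m_j) is damped and iterated to a fixed
# point, giving approximate marginal means <s_i>.
def mean_field(W, b, iters=200, damping=0.5):
    m = np.zeros(len(b))
    for _ in range(iters):
        m_new = np.tanh(b + W @ m)
        m = damping * m + (1 - damping) * m_new  # damped update
    return m

W = np.array([[0.0,  0.5, -0.3],
              [0.5,  0.0,  0.2],
              [-0.3, 0.2,  0.0]])
b = np.array([0.1, -0.2, 0.3])
m = mean_field(W, b)
```

Each update uses only a unit's neighbors in W, which is the "local, and thus very simple" character the abstract refers to.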

[ Comment added in October, 2003: This paper is now of mostly historical importance. At the time of publication (1995) it was one of the first machine learning papers to stress the importance of stochastic sampling in time-series prediction and time-series model learning. In this paper we suggested using Gibbs sampling (Section 4), nowadays particle… (More)


We derive solutions for the problem of missing and noisy data in nonlinear time-series prediction from a probabilistic point of view. We discuss different approximations to the solutions, in particular approximations which require either stochastic simulation or the substitution of a single estimate for the missing data. We show experimentally that commonly… (More)
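The contrast between the two approximations can be seen in a toy example; the quadratic predictor and the Gaussian model for the missing input below are illustrative assumptions, not the paper's models:

```python
import numpy as np

# For a nonlinear predictor f, substituting a single estimate (e.g. the
# mean) for a missing input is generally biased, because E[f(x)] != f(E[x]).
# Averaging predictions over samples of the missing value (stochastic
# simulation) recovers the correct expected prediction.
rng = np.random.default_rng(0)
f = lambda x: x ** 2                 # nonlinear prediction function

# Missing input modeled as x ~ N(0, 1); the true expected prediction is
# E[f(x)] = Var(x) + E[x]**2 = 1.
plug_in = f(0.0)                              # substitute the mean -> 0.0
samples = rng.normal(0.0, 1.0, size=100_000)
monte_carlo = f(samples).mean()               # stochastic simulation -> ~1.0
```

The single-estimate substitution here misses the entire contribution of the input's variance, which is exactly the kind of systematic error the experimental comparison in the paper addresses.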

Structure and parameters in a Bayesian network uniquely specify the probability distribution of the modeled domain. The locality of both structure and probabilistic information is the great benefit of Bayesian networks and requires the modeler to specify only local information. On the other hand, this locality of information might prevent the modeler, and… (More)

- Reimar Hofmann
- 2000

Bayesian networks have been successfully used to model joint probabilities in many cases. When dealing with continuous variables and nonlinear relationships, neural networks can be used to model conditional densities as part of a Bayesian network. However, doing inference can then be computationally expensive. Also, information is implicitly passed… (More)