
When considering the probabilistic approach to neural networks in the framework of statistical pattern recognition, we assume approximation of class-conditional probability distributions by finite mixtures of product components. The mixture components can be interpreted as probabilistic neurons in neurophysiological terms and, in this respect, the fixed… (More)
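As a minimal sketch of the approximation this abstract describes (not the authors' implementation; all parameter values below are hypothetical toy numbers), a class-conditional distribution over binary features can be modeled as a finite mixture of product (Bernoulli) components:

```python
import numpy as np

def product_component(x, theta):
    # One product component over binary features:
    # F(x | theta) = prod_n theta_n^x_n * (1 - theta_n)^(1 - x_n)
    return np.prod(theta**x * (1 - theta)**(1 - x))

def mixture_density(x, weights, thetas):
    # Finite mixture of product components:
    # P(x | class) = sum_m w_m * F(x | theta_m)
    return sum(w * product_component(x, t) for w, t in zip(weights, thetas))

# Hypothetical toy model: two components over three binary features
weights = np.array([0.6, 0.4])
thetas = np.array([[0.9, 0.1, 0.8],
                   [0.2, 0.7, 0.3]])
x = np.array([1, 0, 1])
p = mixture_density(x, weights, thetas)  # P(x | class) for this toy model
```

Each component factorizes over the features, so the joint distribution stays tractable while the mixture as a whole can still capture dependencies between features.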

We discuss the problem of overfitting of probabilistic neural networks in the framework of statistical pattern recognition. The probabilistic approach to neural networks provides a statistically justified subspace method of classification. The underlying structural mixture model includes binary structural parameters and can be optimized by the EM algorithm in… (More)

Considering the probabilistic approach to neural networks in the framework of statistical pattern recognition, we assume approximation of class-conditional probability distributions by finite mixtures of product components. The mixture components can be interpreted as probabilistic neurons in neurophysiological terms and, in this respect, the fixed… (More)

Statistical pattern recognition based on the Bayes formula implies the concept of mutually exclusive classes. This assumption is not applicable when we have to identify non-exclusive properties, and it is therefore unnatural in biological neural networks. Considering the framework of probabilistic neural networks, we propose statistical identification of… (More)
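The mutually exclusive classes the abstract refers to arise directly from the Bayes formula: the posteriors are normalized over all classes, so they sum to one and each observation is attributed to competing classes. A minimal sketch (toy likelihood and prior values are assumptions for illustration):

```python
import numpy as np

def bayes_posterior(likelihoods, priors):
    # Bayes formula under mutually exclusive classes:
    # P(c | x) = p(x | c) P(c) / sum_k p(x | k) P(k)
    joint = likelihoods * priors
    return joint / joint.sum()

# Hypothetical class-conditional likelihoods p(x|c) and priors P(c)
likelihoods = np.array([0.396, 0.050])
priors = np.array([0.5, 0.5])
posterior = bayes_posterior(likelihoods, priors)
```

Because of the shared normalizing denominator, raising one posterior necessarily lowers the others; identifying non-exclusive properties would instead require a separate, independently normalized decision per property.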

The EM algorithm has been used repeatedly to identify latent classes in categorical data by estimating finite distribution mixtures of product components. Unfortunately, the underlying mixtures are not uniquely identifiable and, moreover, the estimated mixture parameters are starting-point dependent. For this reason we use the latent class model only to… (More)
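The EM estimation the abstract mentions can be sketched for the latent class model over binary data, i.e. a finite mixture of Bernoulli product components (an illustrative sketch with a hypothetical toy dataset, not the authors' code; as the abstract notes, the result depends on the starting point, here fixed by a seed):

```python
import numpy as np

def em_bernoulli_mixture(X, M, iters=100, seed=0):
    # EM for a finite mixture of M Bernoulli product components
    # (the latent class model over binary data X of shape (N, D)).
    rng = np.random.default_rng(seed)          # starting point: results depend on it
    N, D = X.shape
    w = np.full(M, 1.0 / M)                    # mixture weights
    theta = rng.uniform(0.25, 0.75, (M, D))    # component parameters
    for _ in range(iters):
        # E-step: responsibilities q(m | x_i), shape (N, M)
        comp = np.prod(theta[None]**X[:, None] * (1 - theta[None])**(1 - X[:, None]),
                       axis=2)
        q = comp * w
        q /= q.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights and component parameters
        w = q.mean(axis=0)
        theta = (q.T @ X) / q.sum(axis=0)[:, None]
        theta = np.clip(theta, 1e-6, 1 - 1e-6)  # keep parameters in the open interval
    return w, theta

# Hypothetical toy data: two latent classes over four binary features
rng = np.random.default_rng(1)
X = np.vstack([rng.random((40, 4)) < [0.9, 0.9, 0.1, 0.1],
               rng.random((40, 4)) < [0.1, 0.1, 0.9, 0.9]]).astype(float)
w, theta = em_bernoulli_mixture(X, M=2)
```

Permuting the components leaves the likelihood unchanged, which is one reason such mixtures are not uniquely identifiable and different initializations can yield different parameter estimates.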
