Myriam Bounhas

Naive Bayesian classifiers, which rely on independence hypotheses together with a normality assumption to estimate densities for numerical data, are known for their simplicity and their effectiveness. However, estimating densities, even under the normality assumption, may be problematic when data are poor. In such a situation, possibility distributions …
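For illustration only (not the authors' implementation), the sketch below shows Gaussian naive Bayes density estimation under the independence and normality assumptions mentioned above; the per-attribute variance it estimates is exactly the quantity that becomes unreliable when only a few samples per class are available.

# Illustrative sketch of Gaussian naive Bayes density estimation.
import math

def fit_gaussian_nb(X, y):
    """Estimate per-class priors and per-attribute (mean, std) from the data."""
    params = {}
    for c in set(y):
        rows = [x for x, label in zip(X, y) if label == c]
        prior = len(rows) / len(X)
        stats = []
        for j in range(len(X[0])):
            col = [r[j] for r in rows]
            mu = sum(col) / len(col)
            var = sum((v - mu) ** 2 for v in col) / len(col)  # unstable with scarce data
            stats.append((mu, math.sqrt(var) + 1e-9))
        params[c] = (prior, stats)
    return params

def predict(params, x):
    """Pick the class maximizing log prior + sum of Gaussian log-densities."""
    def score(c):
        prior, stats = params[c]
        s = math.log(prior)
        for v, (mu, sd) in zip(x, stats):
            s += -math.log(sd * math.sqrt(2 * math.pi)) - (v - mu) ** 2 / (2 * sd ** 2)
        return s
    return max(params, key=score)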
In real-world problems, input data may be pervaded with uncertainty. In this paper, we investigate the behavior of naive possibilistic classifiers, as a counterpart to naive Bayesian ones, for dealing with classification tasks in the presence of uncertainty. For this purpose, we extend possibilistic classifiers, which have recently been adapted to numerical …
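As a hedged illustration of one common way possibilistic classifiers are adapted to numerical attributes (the transformation actually used in the paper may differ), a Gaussian estimate can be rescaled so that its mode receives possibility 1:

# Illustrative possibility degree for a numerical attribute value,
# obtained by rescaling a Gaussian so that pi(mu) = 1.
import math

def attribute_possibility(v, mu, sd):
    """Possibility degree of value v for a class with estimated (mu, sd)."""
    return math.exp(-((v - mu) ** 2) / (2 * sd ** 2))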
Naive Possibilistic Network Classifiers (NPNC) have recently been used to accomplish the classification task in the presence of uncertainty. Because they are mainly based on possibility theory, they come into play when data are pervaded with imperfection, for which possibility theory is the most convenient representation tool. In this paper we investigate …
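Purely as an illustrative sketch of a naive possibilistic decision rule (not necessarily the NPNC variant studied here), per-attribute possibility degrees can be combined with min and the class with the highest resulting possibility kept:

# Illustrative min-based naive possibilistic classification rule.
def classify(possibility_tables, x):
    """possibility_tables: {class: [pi_j(value) callables, one per attribute]}."""
    best_c, best_pi = None, -1.0
    for c, pis in possibility_tables.items():
        pi_c = min(pi(v) for pi, v in zip(pis, x))  # conjunctive combination
        if pi_c > best_pi:
            best_c, best_pi = c, pi_c
    return best_c

A product-based combination of the per-attribute degrees is also common in the possibilistic classification literature and tends to discriminate more finely between classes than min.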