Dragos D. Margineantu

Many machine learning applications require classifiers that minimize an asymmetric cost function rather than the misclassification rate, and several recent papers have addressed this problem. However, these papers have either applied no statistical testing or have applied statistical methods that are not appropriate for the cost-sensitive setting. Without …
For many classification tasks a large number of instances available for training are unlabeled and the cost associated with the labeling process varies over the input space. Meanwhile, virtually all these problems require classifiers that minimize a nonuniform loss function associated with the classification decisions (rather than the accuracy or number of …
Decision tree models typically give good classification decisions but poor probability estimates. In many applications, it is important to have good probability estimates as well. This paper introduces a new algorithm, Bagged Lazy Option Trees (B-LOTs), for constructing decision trees and compares it to an alternative, Bagged Probability Estimation Trees …
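The core idea behind bagged probability estimation, common to both families of trees mentioned above, is to average smoothed per-leaf class frequencies over many trees fit on bootstrap resamples. A minimal sketch of that idea, using a hypothetical one-split "stump" with Laplace-corrected estimates in place of a full tree learner (the stump, threshold, and function names are illustrative assumptions, not the paper's algorithm):

```python
import random

def stump_fit(data, threshold):
    """Fit a one-split 'stump' on (x, y) pairs with y in {0, 1}:
    estimate P(y=1) on each side of the threshold."""
    left = [y for x, y in data if x <= threshold]
    right = [y for x, y in data if x > threshold]

    def prob_pos(side):
        # Laplace correction: (positives + 1) / (count + 2), a common
        # smoothing for probability estimation trees
        return (sum(side) + 1) / (len(side) + 2)

    p_left, p_right = prob_pos(left), prob_pos(right)
    return lambda x: p_left if x <= threshold else p_right

def bagged_probability(data, x, n_bags=50, threshold=0.5, seed=0):
    """Average the probability estimates of stumps fit on bootstrap
    resamples of the training data."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_bags):
        sample = [rng.choice(data) for _ in data]  # bootstrap resample
        estimates.append(stump_fit(sample, threshold)(x))
    return sum(estimates) / n_bags
```

Averaging over resamples smooths the piecewise-constant estimates of any single tree, which is why bagging tends to improve probability estimates even when it changes classification decisions little.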
Many machine learning applications require classifiers that minimize an asymmetric loss function rather than the raw misclassification rate. We study methods for modifying C4.5 to incorporate arbitrary loss matrices. One way to incorporate loss information into C4.5 is to manipulate the weights assigned to the examples from different classes. For 2-class …
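A minimal sketch of the weighting idea for the 2-class case, under the common convention that each example of true class i is weighted by the cost of misclassifying that class (the off-diagonal entry of row i of the loss matrix); the function names and the weighted-vote helper are illustrative assumptions, not the paper's exact procedure:

```python
def class_weights_from_loss(loss):
    """2-class case: weight examples of each true class by the cost
    incurred when an example of that class is misclassified."""
    return {0: loss[0][1], 1: loss[1][0]}

def weighted_class_counts(labels, weights):
    """Total example weight per class. A weighted majority vote (e.g.
    at a decision-tree leaf) predicts the class with the larger total."""
    totals = {0: 0.0, 1: 0.0}
    for y in labels:
        totals[y] += weights[y]
    return totals
```

With a loss matrix such as [[0, 1], [5, 0]], a leaf holding three class-0 examples and one class-1 example would predict class 1 under the weighted vote, even though class 0 is the raw majority.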
This paper addresses two cost-sensitive learning methodology issues. First, we ask the question of whether Bagging is always an appropriate procedure to compute accurate class-probability estimates for cost-sensitive classification. Second, we will point the reader to a potential source of erroneous results in the most common procedure of evaluating …
Many machine learning applications require classifiers that minimize an asymmetric loss function rather than the raw misclassification rate. We introduce a wrapper method for data stratification to incorporate arbitrary cost matrices into learning algorithms. One way to implement stratification for C4.5 decision tree learners is to manipulate the weights …
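One standard way to realize stratification as a wrapper, when the underlying learner cannot accept weights directly, is to resample the training set so each class's share is proportional to its frequency times its misclassification cost. The sketch below illustrates that idea for the 2-class case; the function name and proportionality rule are illustrative assumptions, not the paper's exact method:

```python
import random

def stratify_by_cost(data, loss, seed=0):
    """Resample (x, y) pairs so that each class's share of the output is
    proportional to (class frequency x misclassification cost),
    approximating an asymmetric loss for a cost-blind learner."""
    rng = random.Random(seed)
    by_class = {}
    for x, y in data:
        by_class.setdefault(y, []).append((x, y))
    # 2-class case: cost of misclassifying class y is the off-diagonal
    # entry of row y in the loss matrix
    cost = {y: loss[y][1 - y] for y in by_class}
    total = sum(len(v) * cost[y] for y, v in by_class.items())
    n = len(data)
    out = []
    for y, examples in by_class.items():
        k = round(n * len(examples) * cost[y] / total)
        out.extend(rng.choices(examples, k=k))  # sample with replacement
    rng.shuffle(out)
    return out
```

Because the wrapper only touches the data, not the learner, the same stratification can be applied in front of any classification algorithm.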