Dragos D. Margineantu

Many machine learning applications require classifiers that minimize an asymmetric cost function rather than the misclassification rate, and several recent papers have addressed this problem. However, these papers have either applied no statistical testing or have applied statistical methods that are not appropriate for the cost-sensitive setting. Without …
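A minimal sketch of the decision rule underlying cost-sensitive classification as described above (the function name and cost-matrix convention here are illustrative, not from the paper): given estimated class probabilities and a matrix of misclassification costs, predict the class with minimum expected cost rather than maximum probability.

```python
def min_cost_prediction(probs, cost):
    """Return the class label that minimizes expected misclassification cost.

    probs[i]   : estimated probability that the true class is i
    cost[i][j] : cost incurred by predicting class j when the true class is i
    """
    n = len(probs)
    expected = [sum(probs[i] * cost[i][j] for i in range(n)) for j in range(n)]
    return min(range(n), key=lambda j: expected[j])

# Asymmetric costs: missing class 1 (predicting 0 when the truth is 1)
# costs 10, while a false alarm costs only 1.
cost = [[0, 1],
        [10, 0]]

# An accuracy-minimizing rule would predict class 0 here (P = 0.8),
# but the expected cost of predicting 0 is 0.2 * 10 = 2.0,
# versus 0.8 * 1 = 0.8 for predicting 1.
print(min_cost_prediction([0.8, 0.2], cost))  # prints 1
```

This illustrates why the misclassification rate is the wrong quantity to evaluate: two classifiers with identical accuracy can have very different expected costs under an asymmetric cost matrix.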
For many classification tasks, a large number of the instances available for training are unlabeled, and the cost associated with the labeling process varies over the input space. Meanwhile, virtually all of these problems require classifiers that minimize a non-uniform loss function associated with the classification decisions (rather than the accuracy or number of …
Decision tree models typically give good classification decisions but poor probability estimates. In many applications, it is important to have good probability estimates as well. This paper introduces a new algorithm, Bagged Lazy Option Trees (BLOTs), for constructing decision trees and compares it to an alternative, Bagged Probability Estimation Trees …
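Not the BLOT algorithm itself, but a sketch of why single trees give poor probability estimates and how bagging helps: a leaf with few instances yields extreme relative-frequency estimates (often exactly 0 or 1), while Laplace-correcting each leaf estimate and averaging across bootstrapped trees gives smoother probabilities. The helper names below are illustrative, not from the paper.

```python
def laplace_estimate(pos, total, num_classes=2):
    """Laplace-corrected probability estimate at a tree leaf.

    A raw leaf frequency pos/total gives extreme estimates for small
    leaves (e.g. 3/3 -> 1.0); the correction pulls them toward uniform.
    """
    return (pos + 1) / (total + num_classes)

def bagged_probability(leaf_counts):
    """Average the corrected estimates of the leaves a query instance
    reaches in each bootstrapped tree of the ensemble.

    leaf_counts : list of (positives, total) pairs, one per tree
    """
    estimates = [laplace_estimate(p, n) for p, n in leaf_counts]
    return sum(estimates) / len(estimates)

# A pure leaf of 3 instances: raw frequency says P = 1.0,
# while the Laplace-corrected estimate is (3+1)/(3+2) = 0.8.
print(laplace_estimate(3, 3))  # prints 0.8

# Averaging over bootstrap replicates smooths the estimate further.
print(bagged_probability([(3, 3), (2, 4), (5, 6)]))
```

The point is that the *classification decisions* of the single tree and the ensemble may agree while their probability estimates differ substantially, which is exactly the gap the abstract describes.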
Inverse reinforcement learning (IRL) techniques (Ng & Russell, 2000) provide a foundation for detecting abnormal agent behavior and predicting agent intent by estimating the agent's reward function. Unfortunately, IRL algorithms suffer from the large dimensionality of the reward-function space. Meanwhile, most applications that can benefit from an IRL-based …
For many applications, data mining systems are required to detect anomalous (abnormal, unmodeled, or unexpected) observations. This has so far proven to be a difficult challenge because anomalies are usually considered to be "non-normal" observations, where "normality" is typically defined by very complex concepts. Because of these and other reasons, there …