Criterion | Full name                            | Author
MI        | Mutual Information Maximisation      | Various (1970s)
MIFS      | Mutual Information Feature Selection | Battiti (1994)
JMI       | Joint Mutual Information             | Yang & Moody (1999)
MIFS-U    | …
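As a pointer to what these criteria actually score, the MIFS and JMI selection rules are usually written as below. The notation is standard rather than taken from the table: X_k is a candidate feature, Y the class label, S the set of already-selected features, and beta a redundancy weight.

\[ J_{\mathrm{MIFS}}(X_k) \;=\; I(X_k; Y) \;-\; \beta \sum_{X_j \in S} I(X_k; X_j) \]
\[ J_{\mathrm{JMI}}(X_k) \;=\; \sum_{X_j \in S} I\big(X_k X_j;\, Y\big) \]

MIFS penalises redundancy between the candidate and each selected feature, while JMI scores the joint information that the candidate carries about the class when paired with each selected feature.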
Ensemble approaches to classification and regression have attracted a great deal of interest in recent years. These methods can be shown both theoretically and empirically to outperform single…
We describe the results of a study on a heuristic technique that claimed to effectively balance diversity against individual accuracy among the members of a neural network regression ensemble. We…
Learning in adversarial settings is becoming an important task for application domains where attackers may inject malicious data into the training set to subvert normal operation of data-driven…
Although diversity in classifier ensembles is desirable, its relationship with the ensemble accuracy is not straightforward. Here we derive a decomposition of the majority vote error into three…
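One schematic reading of such a three-term split (a sketch of the idea, not the exact statement from the paper) is

\[ e_{\mathrm{maj}} \;=\; \bar{e}_{\mathrm{ind}} \;-\; d^{+} \;+\; d^{-}, \]

where \(\bar{e}_{\mathrm{ind}}\) is the average individual error, \(d^{+}\) is the average disagreement with the ensemble on examples the majority vote gets right ("good" diversity), and \(d^{-}\) is the corresponding disagreement on examples it gets wrong ("bad" diversity).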
We study the issue of error diversity in ensembles of neural networks. In ensembles of regression estimators, the measurement of diversity can be formalised as the Bias-Variance-Covariance…
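For an ensemble average \(\bar{f} = \frac{1}{M}\sum_{i=1}^{M} f_i\) of M regression estimators and target t, this decomposition takes the standard form

\[ \mathbb{E}\big[(\bar{f} - t)^2\big] \;=\; \overline{\mathrm{bias}}^{\,2} \;+\; \frac{1}{M}\,\overline{\mathrm{var}} \;+\; \Big(1 - \frac{1}{M}\Big)\,\overline{\mathrm{covar}}, \]

where the overlines denote the members' average bias, average variance, and average pairwise covariance.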
Thread-Level Speculation (TLS) facilitates the extraction of parallel threads from sequential applications. Most prior work has focused on developing the compiler and architecture for this execution… (More)
We introduce Learn++.MF, an ensemble-of-classifiers based algorithm that employs random subspace selection to address the missing feature problem in supervised classification. Unlike most established…
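As a rough illustration of the random-subspace idea (a minimal sketch under assumed interfaces, not the Learn++.MF algorithm itself; the class and method names here are hypothetical):

    import java.util.*;
    import java.util.function.Function;

    // Sketch of a random-subspace ensemble that tolerates missing features:
    // each member is trained on a random subset of feature indices, and at
    // test time only members whose subset is fully observed get a vote.
    public class RandomSubspaceSketch {

        // A trained base classifier paired with the feature indices it uses.
        record Member(int[] features, Function<double[], Integer> classifier) {}

        private final List<Member> ensemble = new ArrayList<>();

        // Register a member trained on the given feature subset (training itself omitted).
        void add(int[] features, Function<double[], Integer> classifier) {
            ensemble.add(new Member(features, classifier));
        }

        // Majority vote over the members whose features are all present in this instance.
        int predict(double[] x, Set<Integer> missingFeatureIndices) {
            Map<Integer, Integer> votes = new HashMap<>();
            for (Member m : ensemble) {
                boolean usable = Arrays.stream(m.features())
                                       .noneMatch(missingFeatureIndices::contains);
                if (usable) {
                    votes.merge(m.classifier().apply(x), 1, Integer::sum);
                }
            }
            return votes.entrySet().stream()
                        .max(Map.Entry.comparingByValue())
                        .map(Map.Entry::getKey)
                        .orElseThrow(() -> new IllegalStateException("no usable ensemble member"));
        }
    }

The design point is that a missing feature disables only the members trained on it, so the remaining members can still vote on the instance.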
Fundamental nano-patterns are simple, static, binary properties of Java methods, such as ObjectCreator and Recursive. We present a provisional catalogue of 17 such nano-patterns. We report…
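As a hypothetical illustration (the methods below are invented for this listing) of what two of these binary properties mean in practice:

    import java.util.ArrayList;
    import java.util.List;

    class NanoPatternExamples {

        // ObjectCreator: the method creates new objects via `new`.
        List<String> freshList() {
            return new ArrayList<>();
        }

        // Recursive: the method calls itself.
        long factorial(long n) {
            return n <= 1 ? 1 : n * factorial(n - 1);
        }
    }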
This paper examines the benefits that information theory can bring to the study of multiple classifier systems. We discuss relationships between the mutual information and the classification error of…
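Two standard inequalities connecting these quantities (stated here as background, with \(p_e\) the error of predicting the class Y from the observations X and \(|\mathcal{Y}|\) the number of classes) are Fano's lower bound and the Hellman-Raviv upper bound:

\[ H(Y \mid X) \;\le\; H(p_e) + p_e \log\big(|\mathcal{Y}| - 1\big) \qquad \text{(Fano)} \]
\[ p_e \;\le\; \tfrac{1}{2}\, H(Y \mid X) \qquad \text{(Hellman-Raviv)} \]

Since \(H(Y \mid X) = H(Y) - I(X;Y)\), increasing the mutual information between the classifier outputs and the class variable drives both bounds on the error downward.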