Joel Ratsaby

1 INTRODUCTION We investigate the tradeoff between labeled and unlabeled sample complexities in learning a classification rule for a parametric two-class problem. The classical problem of learning a classification rule can be stated as follows: patterns from classes "1" and "2" (or "states of nature") appear with probabilities … In the problem …
1 Introduction One of the main problems in machine learning and statistical inference is selecting an appropriate model by which a set of data can be explained. In the absence of any structured prior information as to the data-generating mechanism, one is often forced to consider a range of models, attempting to select the model which best explains the …
In this paper we present a new type of binary classifier defined on the unit cube. This classifier combines some of the aspects of the standard methods that have been used in the logical analysis of data (LAD) and geometric classifiers, with a nearest-neighbor paradigm. We assess the predictive performance of the new classifier in learning from a sample, …
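As a rough illustration only, and not the LAD-based classifier the abstract describes, the Python sketch below implements a plain nearest-neighbor rule on the unit cube [0,1]^d. The function name predict_1nn and the synthetic data are assumptions introduced purely for this example.

import numpy as np

def predict_1nn(X_train, y_train, x):
    # Label a query point x in [0,1]^d by the label of its nearest
    # training point under Euclidean distance.  This is only a plain
    # nearest-neighbor baseline; the abstract's classifier additionally
    # uses LAD-style and geometric structure, which is not reproduced here.
    dists = np.linalg.norm(X_train - x, axis=1)
    return y_train[np.argmin(dists)]

# Toy usage on synthetic data in the unit square [0,1]^2.
rng = np.random.default_rng(0)
X = rng.random((200, 2))                    # 200 points in the unit square
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)   # labels from a simple linear rule
print(predict_1nn(X, y, np.array([0.2, 0.3])))   # typically 0 (below the diagonal)
print(predict_1nn(X, y, np.array([0.8, 0.9])))   # typically 1 (above the diagonal)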
In [3], the notion of sample width for binary classifiers mapping from the real line was introduced, and it was shown that the performance of such classifiers could be quantified in terms of this quantity. This paper considers how to generalize the notion of sample width so that we can apply it where the classifiers map from some finite metric space. By …
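To fix ideas about a width-style quantity on a finite metric space, here is a hedged Python sketch. The function sample_width, its inputs, and the toy data are assumptions for illustration only and are not claimed to match the paper's exact definition.

import numpy as np

def sample_width(D, h, sample):
    # Illustrative sketch of a width-style quantity on a finite metric space.
    #   D      : (n, n) distance matrix of the finite metric space
    #   h      : length-n array of classifier outputs in {-1, +1}
    #   sample : indices of the sample points
    # For each sample point i, take the distance to the nearest point that h
    # labels with the opposite sign; return the minimum of these distances.
    # The definition actually used in the paper may differ in detail.
    widths = []
    for i in sample:
        opposite = np.where(h != h[i])[0]            # points with the other label
        widths.append(D[i, opposite].min() if opposite.size else np.inf)
    return min(widths)

# Toy usage: five points on a line, metric = absolute difference.
pts = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
D = np.abs(pts[:, None] - pts[None, :])
h = np.array([-1, -1, -1, +1, +1])                   # a threshold-style classifier
print(sample_width(D, h, sample=[0, 4]))             # 3.0 for point 0, 2.0 for point 4 -> 2.0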