Clayton D. Scott

The Neyman-Pearson (NP) approach to hypothesis testing is useful in situations where different types of error have different consequences or a priori probabilities are unknown. For any α > 0, the NP lemma specifies the most powerful test of size α, but assumes the distributions for each hypothesis are known or (in some cases) the…
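To make the size-α construction concrete, the following is a minimal sketch of what the NP lemma prescribes for two fully known Gaussian hypotheses: threshold the likelihood ratio, with the threshold calibrated (here by Monte Carlo under the null) so the test has size α. The specific distributions, SciPy calls, and sample size are illustrative assumptions, not taken from the paper.

# Minimal sketch: NP likelihood-ratio test for two known Gaussians (assumed example).
import numpy as np
from scipy import stats

alpha = 0.05                      # user-specified size (false alarm rate)
h0 = stats.norm(0.0, 1.0)         # null hypothesis N(0,1), illustrative choice
h1 = stats.norm(1.0, 1.0)         # alternative N(1,1), illustrative choice

def likelihood_ratio(x):
    return h1.pdf(x) / h0.pdf(x)

# Calibrate the threshold so that P0(LR > t) ≈ alpha, via Monte Carlo under H0.
rng = np.random.default_rng(0)
null_samples = h0.rvs(size=100_000, random_state=rng)
t = np.quantile(likelihood_ratio(null_samples), 1 - alpha)

def np_test(x):
    """Reject H0 (declare H1) when the likelihood ratio exceeds the threshold."""
    return likelihood_ratio(x) > t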
A common approach to determining corresponding points on two shapes is to compute the cost of each possible pairing of points and solve the assignment problem (weighted bipartite matching) for the resulting cost matrix. We consider the problem of solving for point correspondences when the shapes of interest are each defined by a single, closed contour. A…
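As a baseline illustration of the assignment-problem step described above (not the contour-ordering method the abstract goes on to develop), here is a hedged sketch using SciPy's optimal-assignment solver on a plain Euclidean cost matrix; the cost choice and function names are assumptions.

# Minimal sketch: point correspondence via weighted bipartite matching.
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def match_points(shape_a, shape_b):
    """shape_a, shape_b: (n, 2) arrays of contour points (or descriptors).
    Returns index pairs (i, j) minimizing the total matching cost."""
    cost = cdist(shape_a, shape_b)            # plain Euclidean cost; any cost matrix works
    rows, cols = linear_sum_assignment(cost)  # Hungarian-style optimal assignment
    return list(zip(rows, cols))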
In this paper, we propose a method for robust kernel density estimation. We interpret a KDE with Gaussian kernel as the inner product between a mapped test point and the centroid of mapped training points in kernel feature space. Our robust KDE replaces the centroid with a robust estimate based on M-estimation (P. Huber, 1981). The iteratively re-weighted…
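A minimal sketch of the iteratively re-weighted idea, assuming a Gaussian kernel and a Huber-type ψ function; the bandwidth, cutoff, and iteration count are illustrative, and this is a reading of the approach rather than the authors' implementation.

# Minimal sketch: robust KDE weights via IRWLS in Gaussian-kernel feature space.
import numpy as np

def gaussian_kernel_matrix(X, sigma):
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def robust_kde_weights(X, sigma=1.0, huber_c=1.0, n_iter=50):
    """Return weights w so that f(x) = sum_i w_i * k(x, x_i) is a robust KDE."""
    n = X.shape[0]
    K = gaussian_kernel_matrix(X, sigma)
    w = np.full(n, 1.0 / n)                       # start from the standard KDE (plain centroid)
    for _ in range(n_iter):
        # squared feature-space distance of each mapped point to the current weighted centroid
        d2 = np.diag(K) - 2.0 * K @ w + w @ K @ w
        d = np.sqrt(np.maximum(d2, 1e-12))
        psi = np.where(d <= huber_c, d, huber_c)  # Huber psi applied to the distances
        w = psi / d                               # IRWLS update: w_i proportional to psi(d_i)/d_i
        w /= w.sum()                              # re-normalize so the weights sum to one
    return w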
Given a probability measure P and a reference measure μ, one is often interested in the minimum μ-measure set with P-measure at least α. Minimum volume sets of this type summarize the regions of greatest probability mass of P, and are useful for detecting anomalies and constructing confidence regions. This paper addresses the problem of estimating minimum…
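For intuition, a simple plug-in baseline (not the estimator studied in the paper) thresholds a kernel density estimate so that the retained high-density region carries roughly a fraction α of the sample; the use of scipy.stats.gaussian_kde and the Lebesgue reference measure are assumptions.

# Minimal sketch: plug-in minimum volume set via density thresholding.
import numpy as np
from scipy.stats import gaussian_kde

def minimum_volume_set(X, alpha=0.9):
    """X: (n, d) sample. Returns (kde, threshold); the estimated set is {x : kde(x) >= threshold}."""
    kde = gaussian_kde(X.T)                      # scipy expects (d, n)-shaped data
    dens = kde(X.T)                              # density estimate at the sample points
    # Keep the highest-density points, capturing approximately a fraction alpha of the mass.
    threshold = np.quantile(dens, 1.0 - alpha)
    return kde, threshold

def in_set(kde, threshold, x):
    """Membership test for a single point x of shape (d,)."""
    return kde(np.atleast_2d(x).T) >= threshold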
Decision trees are among the most popular types of classifiers, with interpretability and ease of implementation being among their chief attributes. Despite the widespread use of decision trees, theoretical analysis of their performance has only begun to emerge in recent years. In this paper, it is shown that a new family of decision trees, dyadic decision…
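To illustrate what "dyadic" means here, the sketch below builds a fixed-depth tree whose cells come from repeatedly halving [0,1]^d at coordinate midpoints, labeling each cell by majority vote; the pruning and penalization the paper actually analyzes are omitted, and binary labels plus features scaled to [0,1] are assumptions.

# Minimal sketch: fixed-depth dyadic partition classifier (midpoint splits only).
import numpy as np

class DyadicTree:
    def __init__(self, depth=4):
        self.depth = depth

    def _cell_index(self, X):
        # Cycle through coordinates, halving the current cell at each level; the bit
        # string identifies which dyadic cell of [0,1]^d each point falls into.
        X = np.asarray(X, dtype=float)
        lo, hi = np.zeros_like(X), np.ones_like(X)
        bits = []
        for level in range(self.depth):
            dim = level % X.shape[1]
            mid = (lo[:, dim] + hi[:, dim]) / 2.0
            right = X[:, dim] >= mid
            bits.append(right.astype(int))
            lo[:, dim] = np.where(right, mid, lo[:, dim])
            hi[:, dim] = np.where(right, hi[:, dim], mid)
        return np.stack(bits, axis=1) @ (2 ** np.arange(self.depth))

    def fit(self, X, y):
        y = np.asarray(y)
        cells = self._cell_index(X)
        self.labels_ = {c: int(np.round(np.mean(y[cells == c]))) for c in np.unique(cells)}
        self.default_ = int(np.round(np.mean(y)))
        return self

    def predict(self, X):
        cells = self._cell_index(X)
        return np.array([self.labels_.get(c, self.default_) for c in cells])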
We study the problem of designing support vector classifiers with respect to a Neyman-Pearson criterion. Specifically, given a user-specified level α ∈ (0,1), how can we ensure a false alarm rate no greater than α while minimizing the miss rate? We examine two approaches, one based on shifting the offset of a conventionally trained SVM and the other…
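The offset-shifting approach can be sketched as follows: train a conventional SVM, then move the decision threshold to the (1−α) quantile of held-out negative scores so that the empirical false alarm rate is at most α. The scikit-learn API, RBF kernel, and hold-out split are assumptions for illustration.

# Minimal sketch: Neyman-Pearson SVM by shifting the offset of a trained SVM.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

def np_svm(X, y, alpha=0.05, random_state=0):
    X_tr, X_ho, y_tr, y_ho = train_test_split(
        X, y, test_size=0.3, random_state=random_state, stratify=y)
    clf = SVC(kernel="rbf").fit(X_tr, y_tr)
    # Decision scores of held-out negatives (class 0); shift the offset so that at
    # most an alpha fraction of them would be flagged as positives.
    scores_neg = clf.decision_function(X_ho[y_ho == 0])
    b_shift = np.quantile(scores_neg, 1.0 - alpha)

    def predict(X_new):
        return (clf.decision_function(X_new) > b_shift).astype(int)
    return predict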
This paper studies the training of support vector machine (SVM) classifiers with respect to the minimax and Neyman-Pearson criteria. In principle, these criteria can be optimized in a straightforward way using a cost-sensitive SVM. In practice, however, because these criteria require especially accurate error estimation, standard techniques for tuning SVM…
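A naive version of the cost-sensitive route, using plain cross-validated error estimates (precisely the quantity the paper argues must be estimated more carefully), might look like the sketch below; the class_weight grid and the scikit-learn calls are assumptions.

# Minimal sketch: cost-sensitive SVM tuned against a Neyman-Pearson constraint.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_predict

def cost_sensitive_np_svm(X, y, alpha=0.05, weights=(1, 2, 4, 8, 16, 32)):
    y = np.asarray(y)
    best = None
    for w0 in weights:                       # heavier penalty on class-0 (false alarm) errors
        clf = SVC(kernel="rbf", class_weight={0: w0, 1: 1.0})
        y_hat = cross_val_predict(clf, X, y, cv=5)
        false_alarm = np.mean(y_hat[y == 0] == 1)
        miss = np.mean(y_hat[y == 1] == 0)
        if false_alarm <= alpha and (best is None or miss < best[1]):
            best = (w0, miss)
    if best is None:
        raise ValueError("no weight met the false alarm constraint; enlarge the grid")
    return SVC(kernel="rbf", class_weight={0: best[0], 1: 1.0}).fit(X, y)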
Hausdorff accurate estimation of density level sets is relevant in applications where a spatially uniform mode of convergence is desired to ensure that the estimated set is close to the target set at all points. The minimax optimal rate of error convergence for the Hausdorff metric is known to be (n/log n) for level sets with Lipschitz boundaries, where the…
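As a point of reference, a plug-in level set estimate on a grid and its Hausdorff distance to a reference point set can be computed as below; the kernel density estimator, grid discretization, and SciPy's directed_hausdorff are assumptions, not the adaptive estimator the paper constructs.

# Minimal sketch: plug-in density level set and its Hausdorff error.
import numpy as np
from scipy.stats import gaussian_kde
from scipy.spatial.distance import directed_hausdorff

def plugin_level_set(X, level, grid):
    """X: (n, d) sample; grid: (m, d) evaluation points. Returns grid points in the estimated set."""
    kde = gaussian_kde(X.T)
    return grid[kde(grid.T) >= level]

def hausdorff_error(est_points, true_points):
    """Symmetric Hausdorff distance between two finite point sets."""
    return max(directed_hausdorff(est_points, true_points)[0],
               directed_hausdorff(true_points, est_points)[0])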
Recovering a pattern or image from a collection of noisy and misaligned observations is a challenging problem that arises in image processing and pattern recognition. This paper presents an automatic, wavelet-based approach to this problem. Despite the success of wavelet decompositions in other areas of statistical signal and image processing, most…
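A generic wavelet pipeline along these lines (not the method developed in the paper) aligns the noisy copies by circular cross-correlation, averages them, and soft-thresholds the wavelet coefficients of the average; PyWavelets, the db4 wavelet, integer circular shifts, and the threshold value are all assumptions.

# Minimal sketch: align noisy copies, average, then wavelet soft-thresholding.
import numpy as np
import pywt

def align_and_recover(observations, wavelet="db4", threshold=0.1):
    """observations: list of equally sized 2-D arrays (noisy, circularly shifted copies)."""
    ref = observations[0]
    aligned = [ref]
    for obs in observations[1:]:
        # Cross-correlation via FFT gives the circular shift that best matches the reference.
        corr = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(obs))).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        aligned.append(np.roll(obs, shift=(dy, dx), axis=(0, 1)))
    avg = np.mean(aligned, axis=0)
    # Soft-threshold the detail coefficients of the averaged image.
    coeffs = pywt.wavedec2(avg, wavelet)
    coeffs = [coeffs[0]] + [tuple(pywt.threshold(c, threshold, mode="soft") for c in level)
                            for level in coeffs[1:]]
    return pywt.waverec2(coeffs, wavelet)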