
In this paper we introduce a new method for robust principal component analysis. Classical PCA is based on the empirical covariance matrix of the data and hence it is highly sensitive to outlying observations. In the past, two robust approaches have been developed. The first is based on the eigenvectors of a robust scatter matrix such as the MCD or an… (More)
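The first robust approach mentioned above — taking eigenvectors of a robust scatter matrix such as the MCD — can be illustrated with a rough sketch. The C-step iteration and the median-based starting subset below are simplifications of the real FAST-MCD algorithm, and the function names are mine, not the paper's:

```python
import numpy as np

def mcd_scatter(X, h=None, n_iter=50):
    """Rough MCD-style location/scatter sketch: iterate C-steps starting
    from the half-sample closest (in L1 norm) to the coordinatewise median.
    A simplification of FAST-MCD, for illustration only."""
    n, p = X.shape
    h = h or (n + p + 1) // 2
    idx = np.argsort(np.abs(X - np.median(X, axis=0)).sum(axis=1))[:h]
    for _ in range(n_iter):
        mu = X[idx].mean(axis=0)
        S = np.cov(X[idx], rowvar=False)
        # Mahalanobis distances of all points w.r.t. the current subset.
        md = np.einsum('ij,jk,ik->i', X - mu, np.linalg.inv(S), X - mu)
        new = np.argsort(md)[:h]
        if np.array_equal(np.sort(new), np.sort(idx)):
            break
        idx = new
    return mu, S

def robust_pca(X):
    """Robust PCA as the eigendecomposition of a robust scatter matrix."""
    mu, S = mcd_scatter(X)
    vals, vecs = np.linalg.eigh(S)
    order = np.argsort(vals)[::-1]          # largest eigenvalue first
    return vals[order], vecs[:, order], mu
```

Because the scatter matrix is estimated from the "clean" half of the data, a handful of outliers cannot tilt the leading eigenvector the way they would in classical PCA.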

In extreme value statistics, the extreme value index is a well-known parameter to measure the tail heaviness of a distribution. Pareto-type distributions, with strictly positive extreme value index (or tail index), are considered. The most prominent extreme value methods are constructed on efficient maximum likelihood estimators based on specific parametric… (More)
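As an illustration of tail-index estimation for Pareto-type distributions (not necessarily the estimator studied in this paper), the classical Hill estimator uses the k + 1 largest order statistics:

```python
import numpy as np

def hill_estimator(x, k):
    """Hill estimator of the extreme value index (tail index), computed
    from the k + 1 largest order statistics of a positive sample."""
    xs = np.sort(np.asarray(x, dtype=float))[::-1]   # descending order statistics
    return float(np.mean(np.log(xs[:k])) - np.log(xs[k]))
```

For an exact Pareto sample with shape parameter α, the estimator targets the extreme value index γ = 1/α; the choice of k trades bias against variance.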

Since MATLAB is very popular in industry and academia, and is frequently used by chemometricians, statisticians, chemists, and engineers, we introduce a MATLAB library of robust statistical methods. Those methods were developed because their classical alternatives produce unreliable results when the data set contains outlying observations. Our toolbox… (More)

When analyzing data, outlying observations cause problems because they may strongly influence the result. Robust statistics aims at detecting the outliers by searching for the model fitted by the majority of the data. We present an overview of several robust methods and outlier detection tools. We discuss robust procedures for univariate, low-dimensional,… (More)
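For the univariate case, the standard robust outlier-detection tool is the median/MAD rule; the sketch below uses a common cutoff of 3.5 on the robust z-score (an illustration of the general idea, not code from the overview itself):

```python
import numpy as np

def mad_outlier_flags(x, cutoff=3.5):
    """Flag outliers by robust z-score: distance to the median, scaled by
    the MAD (the factor 1.4826 makes the MAD consistent for Gaussian data)."""
    x = np.asarray(x, dtype=float)
    med = np.median(x)
    mad = 1.4826 * np.median(np.abs(x - med))
    return np.abs(x - med) / mad > cutoff
```

Unlike the classical mean/standard-deviation rule, the median and MAD are themselves unaffected by the outliers they are meant to detect, so a gross outlier cannot mask itself.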

This paper describes the incorporation of seven stand-alone clustering programs into S-PLUS, where they can now be used in a much more flexible way. The original Fortran programs carried out new cluster analysis algorithms introduced in the book of Kaufman and Rousseeuw (1990). These clustering methods were designed to be robust and to accept dissimilarity… (More)
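One of the methods from Kaufman and Rousseeuw (1990), PAM (partitioning around medoids), operates directly on a dissimilarity matrix, as the abstract notes. The following is a minimal Python sketch of that idea — a greedy build phase followed by swap steps — not the S-PLUS/Fortran code the paper describes:

```python
import numpy as np

def pam(dissim, k):
    """Minimal PAM-style k-medoids sketch on a precomputed dissimilarity
    matrix. Greedy build phase, then accept swaps while cost decreases."""
    n = dissim.shape[0]
    # Build: start with the point minimizing total dissimilarity, add greedily.
    medoids = [int(np.argmin(dissim.sum(axis=1)))]
    while len(medoids) < k:
        best, best_cost = None, np.inf
        for c in range(n):
            if c in medoids:
                continue
            cost = np.minimum(dissim[medoids].min(axis=0), dissim[c]).sum()
            if cost < best_cost:
                best, best_cost = c, cost
        medoids.append(best)
    # Swap: replace a medoid by a non-medoid whenever that lowers the cost.
    improved = True
    while improved:
        improved = False
        cur = dissim[medoids].min(axis=0).sum()
        for i in range(k):
            for c in range(n):
                if c in medoids:
                    continue
                trial = medoids[:i] + [c] + medoids[i + 1:]
                if dissim[trial].min(axis=0).sum() < cur:
                    medoids = trial
                    improved = True
                    break
            if improved:
                break
    labels = np.argmin(dissim[medoids], axis=0)
    return medoids, labels
```

Because only dissimilarities are used, the same code applies to non-Euclidean or mixed-type data, which is what made these algorithms attractive for general dissimilarity input.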

A collection of n hyperplanes in R^d forms a hyperplane arrangement. The depth of a point θ ∈ R^d is the smallest number of hyperplanes crossed by any ray emanating from θ. For d = 2 we prove that there always exists a point with depth at least ⌈n/3⌉. For higher dimensions we conjecture that the maximal depth is at least ⌈n/(d + 1)⌉. For arrangements in general… (More)
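The depth definition above can be checked numerically for d = 2. The sketch below brute-forces over a fine grid of ray directions (so it is approximate, not the exact combinatorial algorithm); the lines are given as rows of A and entries of b with line i being {x : A[i]·x = b[i]}:

```python
import numpy as np

def arrangement_depth_2d(point, A, b, n_dirs=3600):
    """Approximate arrangement depth of a point in R^2: the minimum, over a
    grid of ray directions, of the number of lines the ray crosses."""
    point = np.asarray(point, dtype=float)
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    best = len(b)
    # Small offset avoids directions exactly parallel to axis-aligned lines.
    for th in np.linspace(0.0, 2 * np.pi, n_dirs, endpoint=False) + 1e-4:
        u = np.array([np.cos(th), np.sin(th)])
        with np.errstate(divide='ignore', invalid='ignore'):
            t = (b - A @ point) / (A @ u)   # ray point + t*u meets line i at t[i]
        best = min(best, int(np.sum(t > 0)))  # only t > 0 counts as a crossing
    return best
```

For three lines bounding a triangle, an interior point has depth 1 (every ray must exit through one of the lines), matching the bound ⌈3/3⌉ = 1, while a point far outside the arrangement has depth 0.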

When applying a statistical method in practice it often occurs that some observations deviate from the usual assumptions. However, many classical methods are sensitive to outliers. The goal of robust statistics is to develop methods that are robust against the possibility that one or several unannounced outliers may occur anywhere in the data. These… (More)

MOTIVATION
Principal components analysis (PCA) is a very popular dimension reduction technique that is widely used as a first step in the analysis of high-dimensional microarray data. However, the classical approach that is based on the mean and the sample covariance matrix of the data is very sensitive to outliers. Also, classification methods based on… (More)

Deepest regression (DR) is a method for linear regression introduced by Rousseeuw and Hubert [16]. The DR method is defined as the fit with largest regression depth relative to the data. In this paper we show that DR is a robust method, with breakdown value that converges almost surely to 1/3 in any dimension. We construct an approximate algorithm for fast… (More)

Recent results about the robustness of kernel methods involve the analysis of influence functions. By definition the influence function is closely related to leave-one-out criteria. In statistical learning, the latter is often used to assess the generalization of a method. In statistics, the influence function is used in a similar way to analyze the… (More)
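A finite-sample stand-in for the influence function is the sensitivity curve, which measures the scaled change in a statistic when one extra observation is added (a generic illustration of the concept, not the kernel-method analysis of this paper):

```python
import numpy as np

def sensitivity_curve(stat, x, z):
    """Finite-sample analogue of the influence function: the scaled change
    in the statistic T when one extra observation z is appended to x."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    return n * (stat(np.append(x, z)) - stat(x))
```

For the sample mean the sensitivity curve is unbounded in z, while for the median it stays bounded — the basic contrast between a non-robust and a robust estimator that influence-function analysis formalizes.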