Multivariate and functional classification using depth and distance

@article{Hubert2017MultivariateAF,
  title={Multivariate and functional classification using depth and distance},
  author={Mia Hubert and Peter J. Rousseeuw and Pieter Segaert},
  journal={Advances in Data Analysis and Classification},
  year={2017},
  volume={11},
  pages={445-466}
}
We construct classifiers for multivariate and functional data. Our approach is based on a kind of distance between data points and classes. The distance measure needs to be robust to outliers and invariant to linear transformations of the data. For this purpose we can use the bagdistance which is based on halfspace depth. It satisfies most of the properties of a norm but is able to reflect asymmetry when the class is skewed. Alternatively we can compute a measure of outlyingness based on the… 
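The general idea behind depth-based assignment can be illustrated with a small sketch: approximate the halfspace depth of a new point with respect to each class and assign it to the class in which it is deepest. This is only an illustration under simplifying assumptions (random-projection approximation of halfspace depth, plain maximum-depth assignment); it is not the paper's bagdistance-based classifier, and the function names are hypothetical.

```python
# Minimal illustrative sketch (not the paper's method): approximate halfspace
# (Tukey) depth via random projections and assign a point to the class in
# which it is deepest.
import numpy as np

def approx_halfspace_depth(x, X, n_dir=500, seed=None):
    """Approximate halfspace depth of point x w.r.t. sample X (n x d).

    For each random unit direction u, take the fraction of sample points whose
    projection on u is at least that of x; the depth is approximated by the
    minimum of these fractions over the sampled directions.
    """
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((n_dir, X.shape[1]))
    U /= np.linalg.norm(U, axis=1, keepdims=True)
    frac_above = (X @ U.T >= x @ U.T).mean(axis=0)
    return frac_above.min()

def max_depth_classify(x, class_samples, **depth_kwargs):
    """Assign x to the class in which it attains the largest approximate depth."""
    depths = {label: approx_halfspace_depth(x, X, **depth_kwargs)
              for label, X in class_samples.items()}
    return max(depths, key=depths.get)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    classes = {
        "A": rng.normal([0.0, 0.0], 1.0, size=(200, 2)),
        "B": rng.normal([3.0, 3.0], 1.0, size=(200, 2)),
    }
    print(max_depth_classify(np.array([2.5, 2.8]), classes, seed=1))  # expected: "B"
```

Note that this symmetric maximum-depth rule does not capture what the abstract highlights about the bagdistance, namely its ability to reflect asymmetry when a class is skewed.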
Depth-weighted Bayes classification
Depth-based Classification for Multivariate Data
TLDR
The present paper is an overview of the research in the field of depth-based classification for multivariate data, providing a short summary of the current state of knowledge followed by a detailed discussion of four main directions in depth-based classification, namely semiparametric depth-based classifiers, the maximal depth classifier, (maximal depth) classifiers which use local depth functions, and finally advanced depth-based classifiers.
Localization processes for functional data analysis.
TLDR
This work proposes an alternative to k-nearest neighbors for functional data, whereby the approximating neighbor curves are piecewise functions built from a functional sample, using a locally defined distance function that satisfies stabilization criteria.
Component-wise outlier detection methods for robustifying multivariate functional samples
TLDR
The main application consists of robustifying reference samples of data composed of G different known groups, to be used, for example, in classification procedures.
Automatic feature scaling and selection for support vector machine classification with functional data
TLDR
An embedded feature selection approach for SVM classification is proposed, in which the isotropic Gaussian kernel is modified by associating a bandwidth with each feature, yielding an alternating optimization approach.
On the Optimality of the Max-Depth and Max-Rank Classifiers for Spherical Data
The main goal of supervised learning is to construct a function from labeled training data which assigns arbitrary new data points to one of the labels. Classification tasks may be solved by using…
Robust Depth based weighted Estimator with Application in Discriminant Analysis
The data depth concept is used to measure the depth of a given point within the entire multivariate data…
M-estimators and trimmed means: from Hilbert-valued to fuzzy set-valued data
TLDR
The aim of this paper is to extend M-estimators and trimmed means to p-dimensional fuzzy set-valued data, and to theoretically prove that they inherit robustness from the real settings.
Affine-Invariant Outlier Detection and Data Visualization
TLDR
An open-source data visualization tool based on RSD is developed; its applications in distribution estimation, outlier detection, and tolerance-region construction are shown, and some of the desirable properties of RSD are illustrated via comparisons with other similar notions.
…

References

Showing 1-10 of 56 references
On robust classification using projection depth
This article uses projection depth (PD) for robust classification of multivariate data. Here we consider two types of classifiers, namely, the maximum depth classifier and the modified depth-based…
Integrated data depth for smooth functions and its application in supervised classification
TLDR
A modification of the integrated data depth that takes into account the shape properties of the functions is suggested, by including a derivative (or derivatives) in the definition of the depth measures.
Fast nonparametric classification based on data depth
TLDR
A new procedure, called the DDα-procedure, is developed to solve the problem of classifying d-dimensional objects into q ≥ 2 classes using q-dimensional depth plots and a very efficient algorithm for discrimination analysis in the depth space [0,1]^q (a simplified sketch of this depth-space construction is given after the reference list).
Fast DD-classification of functional data
TLDR
A fast nonparametric procedure for classifying functional data is introduced; it achieves Bayes optimality under standard distributional settings, is robust and efficiently computable, and is implemented in an R environment.
On Maximum Depth and Related Classifiers
Over the last couple of decades, data depth has emerged as a powerful exploratory and inferential tool for multivariate data analysis with widespread applications. This paper investigates…
Classification of functional data: A segmentation approach
Componentwise classification and clustering of functional data
TLDR
Approaches to adaptively choose components are developed, enabling classification and clustering to be reduced to finite-dimensional problems, and to determine regions that are relevant to one of these analyses but not the other.
Comparison between various regression depth methods and the support vector machine to approximate the minimum number of misclassifications
TLDR
Two new approaches to approximating the minimum number of misclassifications achievable with affine hyperplanes are introduced; both are modifications of the regression depth method proposed by Rousseeuw and Hubert (1999) for linear regression models.
Functional Classification and the Random Tukey Depth. Practical Issues
TLDR
The performance of the random Tukey depth in a real data set is compared with the results obtained with the Lopez-Pintado and Romo depths to analyze its behavior in classification problems.
…
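As referenced in the entry on the DDα-procedure above, classification can be carried out in a low-dimensional depth space. The sketch below is a simplified, hypothetical illustration of that depth-space (DD-type) construction: each observation is mapped to its vector of depths with respect to the q classes, and an ordinary classifier is trained on these q-dimensional representations in [0,1]^q. A Mahalanobis-type depth and scikit-learn's k-nearest-neighbor classifier stand in for the depth notions and the α-procedure used in the cited work.

```python
# Simplified DD-type classification sketch (not the DDalpha-procedure itself):
# map each point to its vector of depths w.r.t. the q classes, then train a
# standard classifier in the resulting depth space [0, 1]^q.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def mahalanobis_depth(points, X):
    """Depth of each row of `points` w.r.t. sample X: 1 / (1 + squared Mahalanobis distance)."""
    mu = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    diff = points - mu
    d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)
    return 1.0 / (1.0 + d2)

def depth_space(points, class_samples):
    """Map points (m x d) to the q-dimensional depth space [0, 1]^q."""
    return np.column_stack([mahalanobis_depth(points, X) for X in class_samples])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X0 = rng.normal([0.0, 0.0], 1.0, size=(150, 2))
    X1 = rng.normal([2.5, 2.5], 1.0, size=(150, 2))
    X_train = np.vstack([X0, X1])
    y_train = np.array([0] * 150 + [1] * 150)

    classes = [X0, X1]
    clf = KNeighborsClassifier(n_neighbors=5).fit(depth_space(X_train, classes), y_train)

    X_new = np.array([[0.2, -0.1], [2.4, 2.6]])
    print(clf.predict(depth_space(X_new, classes)))  # expected: [0 1]
```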