# Dynamic Decision Boundary for One-class Classifiers applied to non-uniformly Sampled Data

@article{Grassa2020DynamicDB,
title={Dynamic Decision Boundary for One-class Classifiers applied to non-uniformly Sampled Data},
author={Riccardo La Grassa and Ignazio Gallo and Nicola Landro},
journal={2020 Digital Image Computing: Techniques and Applications (DICTA)},
year={2020},
pages={1-7}
}
• Published 5 April 2020
• Computer Science
• 2020 Digital Image Computing: Techniques and Applications (DICTA)
A typical issue in Pattern Recognition is non-uniformly sampled data, which degrades the general performance and the ability of machine learning algorithms to make accurate predictions. Generally, data are considered non-uniformly sampled when, in a specific area of the data space, there are not enough samples, leading to misclassification problems. This issue undermines the goal of one-class classifiers, decreasing their performance. In this paper, we propose a one-class classifier based on the…
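The problem the abstract describes can be illustrated with a minimal sketch (a toy centroid-plus-radius one-class rule on made-up data, not the paper's method): when the target class is sampled densely in one region and sparsely in another, a static decision boundary fit to all the data rejects legitimate targets from the sparse region.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target class sampled non-uniformly: a dense cluster and a sparse one.
dense = rng.normal(loc=0.0, scale=0.5, size=(200, 2))
sparse = rng.normal(loc=4.0, scale=0.5, size=(5, 2))
train = np.vstack([dense, sparse])

# Naive one-class rule: accept points within mean distance-to-centroid
# plus two standard deviations (a stand-in boundary, not the paper's).
centroid = train.mean(axis=0)
dists = np.linalg.norm(train - centroid, axis=1)
threshold = dists.mean() + 2 * dists.std()

def accept(x):
    return np.linalg.norm(x - centroid) <= threshold

# The centroid and threshold are dominated by the dense region, so a
# legitimate target at the sparse cluster's centre is rejected.
print(accept(np.array([0.1, 0.0])))  # dense-region target: accepted
print(accept(np.array([4.0, 4.0])))  # sparse-region target: rejected
```

This is exactly the misclassification the abstract attributes to non-uniform sampling: the sparse region contributes too few samples to shape the boundary.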

## 2 Citations

$\sigma^2$R Loss: a Weighted Loss by Multiplicative Factors using Sigmoidal Functions
• Computer Science
• 2020
A new loss function called sigma squared reduction loss ($\sigma^2$R loss) is introduced, which is regulated by a sigmoid function that inflates/deflates the error per instance and thereby continues to reduce the intra-class variance.
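The idea of regulating a per-instance loss by a multiplicative sigmoid factor can be sketched as follows (a generic sigmoid-weighted squared error with assumed steepness `k` and threshold `t`, not the published $\sigma^2$R formulation):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_weighted_loss(errors, k=5.0, t=1.0):
    """Deflate small per-instance errors and inflate large ones with a
    multiplicative sigmoid factor centred at threshold t (assumed form)."""
    weights = sigmoid(k * (errors - t))
    return weights * errors ** 2

# Small errors are damped toward zero, large errors keep nearly full weight.
print(sigmoid_weighted_loss(np.array([0.2, 1.0, 3.0])))
```

The multiplicative weight stays in (0, 1), so the scheme reshapes the per-instance contribution rather than changing the sign or scale of the underlying loss.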

## References

Showing 1-10 of 26 references.
Binary Classification using Pairs of Minimum Spanning Trees or N-ary Trees
• Computer Science
CAIP
• 2019
Three methods are proposed which leverage the combination of one-class classifiers based on non-parametric models, N-ary Trees and Minimum Spanning Tree class descriptors (MST-CD), to tackle binary classification problems.
Minimum spanning tree based one-class classifier
• Computer Science
Neurocomputing
• 2009
Deep learning with support vector data description
• Computer Science
Neurocomputing
• 2015
A Hybrid Semi-Supervised Anomaly Detection Model for High-Dimensional Data
• Computer Science
Comput. Intell. Neurosci.
• 2017
A hybrid semi-supervised anomaly detection model for high-dimensional data that consists of a deep autoencoder (DAE) and an ensemble k-nearest neighbor graphs- (K-NNG-) based anomaly detector is proposed.
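The nearest-neighbour side of the hybrid detector summarised above can be sketched with a plain distance-based score (a generic k-NN anomaly score in NumPy with assumed parameters, not the paper's DAE + K-NNG ensemble):

```python
import numpy as np

def knn_anomaly_scores(train, queries, k=3):
    """Score each query by its mean distance to its k nearest training
    points: a plain k-NN distance score standing in for the K-NNG-based
    detector (higher score = more anomalous)."""
    d = np.linalg.norm(queries[:, None, :] - train[None, :, :], axis=2)
    knn = np.sort(d, axis=1)[:, :k]
    return knn.mean(axis=1)

train = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [0.1, 0.1]])
queries = np.array([[0.05, 0.05], [5.0, 5.0]])
scores = knn_anomaly_scores(train, queries)
print(scores)  # the far-away query receives a much larger score
```

In the hybrid model of the paper, such a detector would operate on the low-dimensional codes produced by the deep autoencoder rather than on raw features.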
A Classification Methodology Based on Subspace Graphs Learning
• Computer Science
2019 Digital Image Computing: Techniques and Applications (DICTA)
• 2019
This paper proposes a design methodology for one-class classifiers using an ensemble-of-classifiers approach that takes the best classifier, partitioning the area near a pattern into $\gamma^{\gamma-2}$ sub-spaces and combining all possible spanning trees that can be created starting from $\gamma$ nodes.
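The $\gamma^{\gamma-2}$ count above is Cayley's formula for the number of spanning trees of the complete graph on $\gamma$ labelled nodes. It can be checked numerically with Kirchhoff's matrix-tree theorem (a generic graph-theory computation, not code from the paper):

```python
import numpy as np

def spanning_tree_count(n):
    """Count spanning trees of the complete graph K_n via Kirchhoff's
    matrix-tree theorem: any cofactor of the Laplacian L = n*I - J."""
    L = n * np.eye(n) - np.ones((n, n))
    cofactor = L[1:, 1:]  # delete the first row and column
    return round(np.linalg.det(cofactor))

for gamma in range(2, 7):
    print(gamma, spanning_tree_count(gamma), gamma ** (gamma - 2))
```

Both columns agree (e.g. 16 trees for 4 nodes, 125 for 5), confirming the exponent in the summary.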
Anomaly Detection using One-Class Neural Networks
• Computer Science
ArXiv
• 2018
A comprehensive set of experiments demonstrate that on complex data sets (like CIFAR and PFAM), OC-NN significantly outperforms existing state-of-the-art anomaly detection methods.
Learning Deep Features for One-Class Classification
• Computer Science
IEEE Transactions on Image Processing
• 2019
A novel deep-learning-based approach for one-class transfer learning in which labeled data from an unrelated task is used for feature learning in one-class classification, achieving significant improvements over the state-of-the-art.
Multi-modality in one-class classification
• Computer Science
WWW '10
• 2010
The social network of actors that is implicit in a large body of electronic communication is extracted and turned into valuable features for classifying the exchanged documents, allowing for broader applicability when positive and negative items are not naturally separable.