Dynamic Decision Boundary for One-class Classifiers applied to non-uniformly Sampled Data

@article{Grassa2020DynamicDB,
  title={Dynamic Decision Boundary for One-class Classifiers applied to non-uniformly Sampled Data},
  author={Riccardo La Grassa and Ignazio Gallo and Nicola Landro},
  journal={2020 Digital Image Computing: Techniques and Applications (DICTA)},
  year={2020},
  pages={1-7}
}
A common issue in pattern recognition is non-uniformly sampled data, which degrades the overall performance of machine learning algorithms and their ability to make accurate predictions. Data are generally considered non-uniformly sampled when, in a specific region of the data space, there are not enough samples, which leads to misclassification. This issue undermines the goal of one-class classifiers and decreases their performance. In this paper, we propose a one-class classifier based on the…
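The truncated abstract does not show the paper's dynamic-boundary formulation, but the one-class setting it describes can be illustrated with a minimal, generic sketch: a centroid-based classifier whose decision boundary is a quantile of the training distances. This is an assumption-laden illustration of one-class classification in general, not the method proposed in the paper.

```python
import numpy as np

def fit_one_class(X, quantile=0.95):
    """Fit a minimal centroid-based one-class model.

    The decision boundary is the given quantile of the training
    distances from the centroid. Illustrative only: the paper
    proposes a *dynamic* boundary, which this fixed radius is not.
    """
    centroid = X.mean(axis=0)
    dists = np.linalg.norm(X - centroid, axis=1)
    radius = np.quantile(dists, quantile)
    return centroid, radius

def predict_one_class(X, centroid, radius):
    """Return 1 for inliers (inside the boundary), -1 for outliers."""
    dists = np.linalg.norm(X - centroid, axis=1)
    return np.where(dists <= radius, 1, -1)

# Toy data: 200 points from a 2-D standard normal.
rng = np.random.default_rng(0)
X_train = rng.normal(0.0, 1.0, size=(200, 2))
centroid, radius = fit_one_class(X_train)

X_test = np.array([[0.1, -0.2],   # near the centroid -> inlier
                   [5.0, 5.0]])   # far from the mass -> outlier
print(predict_one_class(X_test, centroid, radius))
```

With a fixed radius like this, sparsely sampled regions of the target class fall outside the boundary and get misclassified, which is exactly the failure mode the paper attributes to non-uniform sampling.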
σ²R Loss: a Weighted Loss by Multiplicative Factors using Sigmoidal Functions
A new loss function called sigma squared reduction loss (σ²R loss) is introduced, which is regulated by a sigmoid function to inflate/deflate the per-instance error and to keep reducing the intra-class variance.
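The idea of a sigmoid-regulated, per-instance weighting can be sketched as follows. This is a hedged illustration of the general mechanism (sigmoidal multiplicative factors inflating large errors and deflating small ones); the exact σ²R formulation, including its α and τ parameters here, is assumed for the sketch and should be taken from the cited paper.

```python
import numpy as np

def sigmoid_weighted_loss(errors, alpha=1.0, tau=1.0):
    """Illustrative sigmoid-weighted squared loss.

    Each per-instance error e is multiplied by a sigmoidal factor
    sigmoid(alpha * (e - tau)) in (0, 1): errors above the
    threshold tau are inflated relative to errors below it,
    pushing the model to shrink intra-class variance.
    (Sketch only; not the paper's exact sigma^2 R loss.)
    """
    errors = np.asarray(errors, dtype=float)
    weights = 1.0 / (1.0 + np.exp(-alpha * (errors - tau)))
    return float(np.mean(weights * errors ** 2))

# Small errors receive low weight, large errors high weight.
print(sigmoid_weighted_loss([0.1, 0.2]))  # near zero
print(sigmoid_weighted_loss([2.0, 3.0]))  # much larger
```

The multiplicative factor keeps the loss differentiable everywhere, so it can be dropped into standard gradient-based training without changes to the optimizer.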
