Incremental Kernel Null Space Discriminant Analysis for Novelty Detection

  • Juncheng Liu, Zhouhui Lian, Yi Wang, J. Xiao
  • 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
Novelty detection, which aims to determine whether a given sample belongs to any category of the training data, is an important and challenging problem in pattern recognition and machine learning. Recently, the kernel null space method (KNDA) was reported to achieve state-of-the-art performance in novelty detection. However, KNDA is hard to scale up because of its high computational cost. With the ever-increasing size of data, accelerating the execution speed of KNDA is …
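The core idea behind the kernel null space approach can be sketched in a simplified linear form (an illustrative assumption, not the paper's kernelized algorithm): directions in the null space of the within-class scatter matrix annihilate all within-class variation, so every known class collapses to a single point, and a test sample's distance to the nearest class point serves as its novelty score.

```python
import numpy as np

def null_space_novelty_score(X, y, x_test, tol=1e-8):
    """Linear null-space novelty score (illustrative sketch only).

    Directions w with Sw @ w = 0 annihilate within-class deviations,
    so every sample of a class projects onto its class-mean point;
    the novelty score of x_test is its distance to the nearest such
    point. KNDA itself works in a kernel-induced feature space.
    """
    classes = np.unique(y)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    for c in classes:
        Xc = X[y == c]
        dev = Xc - Xc.mean(axis=0)
        Sw += dev.T @ dev                  # within-class scatter
    vals, vecs = np.linalg.eigh(Sw)
    N = vecs[:, vals < tol]                # orthonormal null-space basis
    targets = [N.T @ X[y == c].mean(axis=0) for c in classes]
    z = N.T @ x_test                       # project the test sample
    return min(np.linalg.norm(z - t) for t in targets)
```

Known-class samples score near zero because they land exactly on their class point, while samples from unseen classes fall elsewhere in the null space. Note the null space must be non-trivial (feature dimension exceeding the number of samples minus classes), which is one reason the kernelized formulation matters in practice.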
Novelty Detection and Online Learning for Chunk Data Streams
  • Yi Wang, Yi Ding, +6 authors Jiebo Luo
  • Computer Science, Medicine
  • IEEE Transactions on Pattern Analysis and Machine Intelligence
  • 2021
Theoretical analysis and experimental validation on under-sampled and large-scale real-world datasets demonstrate that the proposed algorithms make it possible to learn unlabeled chunk data streams at significantly lower computational cost, with accuracy comparable to state-of-the-art approaches.
Multi-class Novelty Detection Using Mix-up Technique
This work proposes a novel solution for novelty detection based on the mix-up technique, termed the Segregation Network, which is trained using only the available known-class data and does not need access to any auxiliary dataset or attributes.
Fast Factorization-free Kernel Learning for Unlabeled Chunk Data Streams
A fast factorization-free kernel learning method that unifies novelty detection and incremental learning for unlabeled chunk data streams in one framework, constructing a joint reproducing kernel Hilbert space from known class centers by solving a linear system in kernel space.
One-Class Kernel Spectral Regression for Outlier Detection
The paper introduces a new efficient nonlinear one-class classifier formulated as the Rayleigh quotient criterion optimisation. The method, operating in a reproducing kernel Hilbert subspace, …
Robust One-Class Kernel Spectral Regression
Through extensive experiments, the proposed methodology is found to enhance robustness against contamination in the training set compared with the baseline kernel null-space method, as well as other existing approaches in the OCC paradigm, while providing the functionality to rank training samples effectively.
Utilizing Patch-Level Category Activation Patterns for Multiple Class Novelty Detection
A novel method that makes deep convolutional neural networks robust to novel classes by combining global and local branch information to train a novelty detection network, which is used during inference to identify novel classes.
Feature extraction from null and non-null spaces of kernel local discriminant embedding
A Two Subspace-based Kernel Local Discriminant Embedding (TSKLDE) method is proposed, which extracts features from both the non-null and null spaces of the within-class locality-preserving scatter matrix of LDE in the kernel space.
Deep Transfer Learning for Multiple Class Novelty Detection
It is shown that thresholding the maximal activation of the proposed network can be used to identify novel objects effectively, and that the proposed method achieves significant improvements over the state-of-the-art methods.
Adversarially Learned One-Class Classifier for Novelty Detection
The results on MNIST and Caltech-256 image datasets, along with the challenging UCSD Ped2 dataset for video anomaly detection, illustrate that the proposed method learns the target class effectively and is superior to the baseline and state-of-the-art methods.
One-Class Kernel Spectral Regression
The paper introduces a new efficient nonlinear one-class classifier formulated as the Rayleigh quotient criterion optimisation. The method, operating in a reproducing kernel Hilbert space, minimises …


Kernel Null Space Methods for Novelty Detection
This work presents how to apply a null space method for novelty detection, mapping all training samples of one class to a single point; the approach outperforms all other methods for multi-class novelty detection.
Kernel Fisher Discriminants for Outlier Detection
The problem of detecting atypical objects or outliers is one of the classical topics in (robust) statistics. Recently, it has been proposed to address this problem by means of one-class SVMs …
Null space-based kernel Fisher discriminant analysis for face recognition
  • Wei Liu, Yunhong Wang, S. Li, T. Tan
  • Mathematics, Computer Science
  • Sixth IEEE International Conference on Automatic Face and Gesture Recognition, 2004. Proceedings.
  • 2004
From the theoretical analysis, the NLDA algorithm and the situations for which NLDA is best suited are presented; the method is simpler than other null space approaches, reducing computational cost while maintaining performance.
Fast incremental LDA feature extraction
New algorithms to accelerate the convergence rate of the incremental LDA algorithm given by Chatterjee and Roychowdhury are derived by optimizing the step size in each iteration using steepest descent and conjugate direction methods.
Incremental linear discriminant analysis for classification of data streams
The results show that the proposed ILDA can effectively evolve a discriminant eigenspace over a fast and large data stream, and extract features with superior discriminability in classification, when compared with other methods.
Incremental Kernel Principal Component Analysis
The basis of the proposed solution lies in computing incremental linear PCA in the kernel-induced feature space, and constructing reduced-set expansions to maintain constant update speed and memory usage.
Novelty Detection in Learning Systems
Novelty detection is concerned with recognising inputs that differ in some way from those that are usually seen. It is a useful technique in cases where an important class of data is …
A review of novelty detection
This review aims to provide an updated and structured investigation of novelty detection research papers that have appeared in the machine learning literature during the last decade.
Efficient Kernel Discriminant Analysis via Spectral Regression
  • Deng Cai, X. He, Jiawei Han
  • Computer Science, Mathematics
  • Seventh IEEE International Conference on Data Mining (ICDM 2007)
  • 2007
By using spectral graph analysis, SRKDA casts discriminant analysis into a regression framework that facilitates both efficient computation and the use of regularization techniques, yielding a substantial saving in computational cost.
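The regression trick behind SRKDA can be sketched as follows (a simplified illustration assuming class-indicator responses, not the authors' exact implementation): for supervised problems the graph eigenvectors are spanned by class-indicator vectors, so instead of solving a dense generalized eigenproblem one forms centered indicator responses and solves a single regularized linear system in kernel space.

```python
import numpy as np

def srkda_fit(K, y, delta=0.01):
    """Kernel discriminant analysis via spectral regression (sketch).

    The eigen-step reduces to forming centered class-indicator
    responses Y; the projection coefficients then come from one
    regularized solve: (K + delta*I) @ alpha = Y.
    """
    classes = np.unique(y)
    n = len(y)
    Y = np.stack([(y == c).astype(float) for c in classes], axis=1)
    Y -= Y.mean(axis=0)   # remove the trivial constant eigenvector
    alpha = np.linalg.solve(K + delta * np.eye(n), Y)
    return alpha          # project new data via k(x_new, X) @ alpha

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian kernel matrix between row sets A and B (helper)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)
```

Replacing the eigen-decomposition with a regularized solve is what makes the approach scale: one linear system per discriminant direction instead of a dense generalized eigenproblem over the full kernel matrix.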
Implementation of incremental linear discriminant analysis using singular value decomposition for face recognition
In the proposed ILDA-SVD algorithm, the approximation error is proved to be mathematically bounded, and simulation results on the Yale database show that the proposed algorithm significantly outperforms other well-known systems in terms of recognition rate.