Mitigating Concept Drift via Rejection

  • Jan Philip Göpfert, Barbara Hammer, Heiko Wersing
  • Published in ICANN 2018
  • Computer Science
  • Learning in non-stationary environments is challenging because, under such conditions, the common assumption of independent and identically distributed data does not hold; when concept drift is present, continuous system updates become necessary. In recent years, several powerful approaches have been proposed. However, these models typically classify any input regardless of their confidence in the classification: a strategy that is not optimal, particularly in safety-critical environments…
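The core idea the abstract describes is a reject option: rather than forcing a prediction on every input, the classifier abstains when its confidence is too low. A minimal sketch of this mechanism, assuming a simple fixed confidence threshold (the function and names below are illustrative, not the paper's method):

```python
# Illustrative sketch of classification with a reject option:
# abstain whenever the top class score (read as confidence)
# falls below a threshold, instead of always predicting.

REJECT = None  # sentinel value returned when the classifier abstains

def predict_with_reject(scores, threshold=0.7):
    """Return the highest-scoring class label, or REJECT if its
    score is below `threshold`.

    `scores` maps class labels to confidence values in [0, 1].
    """
    label = max(scores, key=scores.get)
    if scores[label] < threshold:
        return REJECT
    return label

# A confident input is classified; an ambiguous one is rejected.
confident = {"cat": 0.92, "dog": 0.08}
ambiguous = {"cat": 0.51, "dog": 0.49}
print(predict_with_reject(confident))  # "cat"
print(predict_with_reject(ambiguous))  # None (rejected)
```

Under concept drift, rejected inputs can be routed to a fallback (e.g. a human operator or a safe default action), which is why the paper argues this matters in safety-critical settings.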


