Learning in the presence of concept drift and hidden contexts

@article{Widmer1996LearningIT,
  title={Learning in the presence of concept drift and hidden contexts},
  author={Gerhard Widmer and Miroslav Kub{\'a}t},
  journal={Machine Learning},
  year={1996},
  volume={23},
  pages={69-101}
}
On-line learning in domains where the target concept depends on some hidden context poses serious problems. A changing context can induce changes in the target concepts, producing what is known as concept drift. We describe a family of learning algorithms that flexibly react to concept drift and can take advantage of situations where contexts reappear. The general approach underlying all these algorithms consists of (1) keeping only a window of currently trusted examples and hypotheses; (2… 
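The three ingredients named in the abstract (a window of currently trusted examples, a store of old concept descriptions that can be re-used when a context reappears, and a heuristic that monitors the learner's behaviour) can be illustrated with a short sketch. The Python class below is only a minimal illustration under stated assumptions, not the FLORA algorithms themselves: the class name WindowDriftLearner, the nearest-class-mean hypothesis, the 0.6 accuracy threshold and the half-window forgetting step are all assumptions made for the example.

```python
from collections import deque


class WindowDriftLearner:
    """Minimal sketch of a window-based drift learner (an illustration, not FLORA):
    - keeps a sliding window of recent labelled examples,
    - rebuilds a simple hypothesis (per-class feature means) from that window,
    - shrinks the window when recent accuracy drops (suspected drift),
    - stores old hypotheses and re-activates one if it fits the current window better
      (a crude stand-in for re-using concepts when a context reappears).
    """

    def __init__(self, min_window=20, max_window=200):
        self.min_window = min_window
        self.window = deque(maxlen=max_window)      # trusted recent examples
        self.recent_correct = deque(maxlen=30)      # short accuracy trace for monitoring
        self.stored = []                            # previously built hypotheses
        self.hypothesis = None                      # current per-class mean vectors

    def _fit(self, examples):
        # hypothesis = per-class mean feature vector (deliberately simple)
        sums, counts = {}, {}
        for x, y in examples:
            acc = sums.setdefault(y, [0.0] * len(x))
            counts[y] = counts.get(y, 0) + 1
            for i, v in enumerate(x):
                acc[i] += v
        return {y: [s / counts[y] for s in sums[y]] for y in sums}

    def _predict_with(self, hypo, x):
        # classify by nearest class mean (squared Euclidean distance)
        return min(hypo, key=lambda y: sum((a - b) ** 2 for a, b in zip(hypo[y], x)))

    def predict(self, x):
        return None if self.hypothesis is None else self._predict_with(self.hypothesis, x)

    def _window_accuracy(self, hypo):
        return sum(self._predict_with(hypo, x) == y for x, y in self.window) / len(self.window)

    def learn(self, x, y):
        # (1) monitor: did the current hypothesis classify this example correctly?
        if self.hypothesis is not None:
            self.recent_correct.append(self.predict(x) == y)
        self.window.append((x, y))

        # (2) window-control heuristic (assumption: fixed accuracy threshold of 0.6)
        acc = sum(self.recent_correct) / len(self.recent_correct) if self.recent_correct else 1.0
        if acc < 0.6 and len(self.window) > self.min_window:
            # suspected drift: store the old concept, then forget the older half of the window
            if self.hypothesis is not None:
                self.stored.append(self.hypothesis)
            for _ in range(len(self.window) // 2):
                self.window.popleft()
            self.recent_correct.clear()

        # (3) rebuild from the window, but re-activate a stored concept if it explains
        #     the current window better (stand-in for handling recurring contexts)
        candidate = self._fit(self.window)
        self.hypothesis = max([candidate] + self.stored, key=self._window_accuracy)
```

In a test-then-train loop one would call predict(x) on each arriving example before calling learn(x, y) with its label; the hypothetical 0.6 threshold and half-window forgetting step stand in for the paper's more careful window-adjustment heuristic.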
Citations

A Survey on Concept Drift Adaptation
TLDR
This introduction to concept drift adaptation presents state-of-the-art techniques and a collection of benchmarks for researchers, industry analysts, and practitioners to reflect on the existing, scattered state of the art.
Detecting and adapting to drifting concepts
TLDR
An ensemble learning method for supervised learning with drifting concepts is presented; it consistently recognizes different types of drift, adapts quickly to these changes to maintain its performance level, and reuses former knowledge to improve performance on recurring contexts.
Characterizing concept drift
TLDR
This work presents the first comprehensive framework for the quantitative analysis of drift, giving rise to a new taxonomy of concept drift types and a solid foundation for research into mechanisms to detect and address concept drift.
A Review on Dynamic Concept Drift
TLDR
Dynamic concept drift handling is presented as a technique that deals with concept drift effectively by using ensembles, with the Naïve Bayes base classifier found to perform best.
Learning in Environments with Unknown Dynamics: Towards more Robust Concept Learners
TLDR
An incremental decision tree is presented that is updated with incoming examples and outperforms the evaluated methods at dealing with concept drift on problems where concept change occurs at different speeds, noise may be present, and examples may arrive from different areas of the problem domain.
From Concept Drift to Model Degradation: An Overview on Performance-Aware Drift Detectors
TLDR
A comprehensive analysis of the main attributes and strategies for tracking and evaluating the model's performance is presented.
An Overview on Concept Drift Learning
TLDR
This paper surveys several works that deal with concept drift and presents a comprehensive study of public synthetic and real datasets that can be used to cope with the problem.
Learning Recurring Concepts from Data Streams in Ubiquitous Environments
TLDR
This PhD thesis addresses the aforementioned open issues, focusing on learning anytime, anywhere classification models from data streams in ubiquitous environments, where the underlying concepts may change over time, with special emphasis on recurring concepts.

References

Showing 1-10 of 51 references
Combining Robustness and Flexibility in Learning Drifting Concepts
TLDR
An algorithm is presented that is both robust against noise and quick at recognizing and adapting to changes in the target concepts, and is implemented in a system named FLORA4, the latest member of a whole family of learning algorithms.
Effective Learning in Dynamic Environments by Explicit Context Tracking
TLDR
By explicitly storing old hypotheses and re-using them to bias learning in new contexts, the method possesses the ability to utilize experience from previous learning, which greatly increases the system's effectiveness in environments where contexts can reoccur periodically.
Learning Flexible Concepts from Streams of Examples: FLORA 2
TLDR
FLORA2 is a program for supervised learning of concepts that are subject to concept drift; it keeps in memory not only valid descriptions of the concepts as derived from the objects currently present in the window, but also 'candidate descriptions' that may turn into valid descriptions in the future.
COBBIT - A Control Procedure for COBWEB in the Presence of Concept Drift
TLDR
Six mechanisms that can detect concept drift and adjust the conceptual structure are proposed, and a variant of one of these mechanisms, dynamic deletion of old examples, is implemented in a modified COBWEB system called COBBIT.
Incremental learning from noisy data
TLDR
This paper first reviews a framework for discussing machine learning systems and then describes STAGGER in that framework, which is based on a distributed concept description which is composed of a set of weighted, symbolic characterizations.
Tracking drifting concepts by minimizing disagreements
TLDR
This paper shows that if H is properly PAC-learnable, then there is an efficient (randomized) algorithm that, with high probability, approximately minimizes disagreements to within a factor of 7d + 1, yielding an efficient tracking algorithm for H which tolerates drift rates up to a constant times ε²/(d² ln(1/ε)).
On-line learning with an oblivious environment and the power of randomization
Incrementally Learning Time-Varying Half Planes
TLDR
A distribution-free model for incremental learning when concepts vary with time is presented and it is shown that the average mistake rate depends on the maximum rate at which an adversary can modify the concept.
Robust Classification with Context-Sensitive Features
TLDR
This paper addresses the problem of classifying observations when features are context-sensitive, especially when the testing set involves a context that is different from the training set, and presents general strategies for enhancing the performance of classification algorithms on this type of problem.
Learning Time-Varying Concepts
TLDR
This work has extended formal definitions of concepts and learning to provide a framework in which an algorithm can track a concept as it evolves over time, and derived some PAC-style sample complexity results that determine, for example, when tracking is feasible.