Abstract
Learning in nonstationary environments, also known as learning concept drift, is concerned with learning from data whose statistical characteristics change over time. Concept drift is further complicated if the data set is also class-imbalanced. While these two issues have been addressed independently, their joint treatment has remained largely underexplored. We describe two ensemble-based approaches for learning concept drift from imbalanced data. Our first approach combines our previously introduced Learn++.NSE algorithm for concept drift with the well-established SMOTE (synthetic minority oversampling technique) for learning from imbalanced data. Our second approach makes two major modifications to the Learn++.NSE-SMOTE integration: it replaces SMOTE with a subensemble that makes strategic use of minority class data, and it replaces Learn++.NSE's class-independent error weighting mechanism with a penalty constraint that forces the algorithm to balance accuracy on all classes. The primary novelty of this approach lies in determining the voting weights for combining ensemble members based on each classifier's time- and imbalance-adjusted accuracy on current and past environments. Favorable results in comparison to other algorithms indicate that both approaches are able to address this challenging problem, each with its own specific areas of strength. We also release all experimental data as a resource and benchmark for future research.
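For readers who want to experiment with these ideas, the following Python sketch illustrates the general recipe described above: each incoming batch trains a new ensemble member on SMOTE-rebalanced data (in the spirit of the first approach), and members vote with weights derived from a sigmoid time-discounted, class-balanced error on current and past batches (in the spirit of the second). The class name `DriftEnsembleSketch`, the hyperparameters `a` and `b`, and the exact discounting and penalty forms are illustrative assumptions, not the algorithms published in the paper.

```python
import numpy as np
from imblearn.over_sampling import SMOTE
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import balanced_accuracy_score


class DriftEnsembleSketch:
    """One base learner per incoming batch; voting weights come from a
    sigmoid time-discounted, class-balanced error on current and past
    batches. Illustrative only; not the paper's exact formulation."""

    def __init__(self, a=0.5, b=3):
        self.a = a            # sigmoid slope (assumed hyperparameter)
        self.b = b            # sigmoid inflection offset (assumed)
        self.members = []     # classifiers, one per batch
        self.errors = []      # errors[k]: member k's balanced-error history

    def fit_batch(self, X, y):
        # Rebalance the incoming batch with SMOTE before training the
        # newest member (SMOTE needs enough minority samples for its
        # nearest-neighbor interpolation).
        X_res, y_res = SMOTE().fit_resample(X, y)
        self.members.append(DecisionTreeClassifier().fit(X_res, y_res))
        self.errors.append([])
        # Evaluate every member on the raw current batch with a
        # class-balanced error, so minority-class mistakes count as much
        # as majority-class ones (a stand-in for the penalty constraint
        # described in the abstract).
        for k, m in enumerate(self.members):
            err = 1.0 - balanced_accuracy_score(y, m.predict(X))
            self.errors[k].append(min(err, 0.499))  # cap weak members

    def _vote_weights(self):
        w = []
        for errs in self.errors:
            n = len(errs)
            t = np.arange(n)
            # Recent batches count more (Learn++.NSE-style sigmoid).
            omega = 1.0 / (1.0 + np.exp(-self.a * (t - (n - 1) + self.b)))
            omega /= omega.sum()
            beta = max(float(np.dot(omega, errs)), 1e-10)
            w.append(np.log((1.0 - beta) / beta))  # log-odds weight
        return np.asarray(w)

    def predict(self, X):
        w = self._vote_weights()
        votes = np.stack([m.predict(X) for m in self.members])  # (K, N)
        classes = np.unique(votes)
        scores = np.stack([(votes == c).T @ w for c in classes])
        return classes[np.argmax(scores, axis=0)]
```

Calling `fit_batch` once per incoming batch and `predict` on the following batch approximates the test-then-train streaming protocol commonly used to evaluate drift-handling ensembles.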
| Original language | English (US) |
| --- | --- |
| Article number | 6235959 |
| Pages (from-to) | 2283-2301 |
| Number of pages | 19 |
| Journal | IEEE Transactions on Knowledge and Data Engineering |
| Volume | 25 |
| Issue number | 10 |
| DOIs | |
| State | Published - 2013 |
| Externally published | Yes |
Keywords
- Incremental learning
- class imbalance
- concept drift
- multiple classifier systems
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Computational Theory and Mathematics