
Concept Drift Detection via Equal Intensity k-Means Space Partitioning



Abstract:

A data stream poses additional challenges to statistical classification tasks because the distributions of the training and target samples may differ as time passes. Such a distribution change in streaming data is called concept drift. Numerous histogram-based distribution change detection methods have been proposed to detect drift. Most histograms are built on grid-based or tree-based space-partitioning algorithms, which makes the partitions arbitrary and unexplainable and may cause drift blind spots. There is a need to improve drift detection accuracy for histogram-based methods in the unsupervised setting. To address this problem, we propose a cluster-based histogram, called equal intensity k-means space partitioning (EI-kMeans). In addition, a heuristic method to improve the sensitivity of drift detection is introduced. The fundamental idea is to minimize the risk of creating partitions in distribution offset regions. Pearson's chi-square test is used as the statistical hypothesis test so that the test statistic remains independent of the sample distribution. The number of bins and their shapes, which strongly influence the ability to detect drift, are determined dynamically from the sample based on an asymptotic constraint in the chi-square test. Accordingly, three algorithms are developed to implement concept drift detection: a greedy centroids initialization algorithm, a cluster amplify-shrink algorithm, and a drift detection algorithm. For drift adaptation, we recommend retraining the learner when a drift is detected. Experiments on synthetic and real-world datasets demonstrate the advantages of EI-kMeans and show its efficacy in detecting concept drift.
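The abstract's point that the number of bins is bounded by an asymptotic constraint of the chi-square test can be illustrated with the standard rule of thumb; the paper's exact constraint is not reproduced here, and the common requirement that every expected bin count be at least about five is an assumption of this sketch:

```python
def max_bins(n_samples, min_expected=5):
    """Largest number of bins K such that a uniform expected count
    n_samples / K stays >= min_expected, the usual asymptotic
    validity condition for Pearson's chi-square test."""
    return max(1, n_samples // min_expected)

print(max_bins(100))  # 20 bins are admissible for 100 samples
print(max_bins(37))   # only 7 bins for 37 samples
```

With fewer samples, fewer (and therefore coarser) partitions are admissible, which is why the bin count must be chosen dynamically from the sample size.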
Published in: IEEE Transactions on Cybernetics ( Volume: 51, Issue: 6, June 2021)
Page(s): 3198 - 3211
Date of Publication: 22 April 2020

PubMed ID: 32324590


I. Introduction

Streaming data classification consists of a routine where a model is trained on historical data and then used to classify upcoming samples. When the labels of newly arrived samples become available, those samples join the training data. Concept drift refers to inconsistencies in data generation at different times, meaning that the training data and the testing data have different distributions [1]–[3]. Drift detection aims to identify these differences with a statistical guarantee through what is, typically, a four-step process [4]: 1) cut the data stream into chunks to serve as training/testing sets; 2) abstract the datasets into a comparable model; 3) define a test statistic or similarity measure to quantify the distance between the models; and 4) design a hypothesis test to investigate the null hypothesis (most often, that there is no concept drift).
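The four-step process can be sketched as follows. This is an illustrative outline, not the paper's EI-kMeans: a plain k-means partition of the training chunk stands in for the space-partitioning model of step 2, and Pearson's chi-square statistic with its asymptotic null distribution covers steps 3 and 4 (`numpy` and `scipy` are assumed to be available):

```python
import numpy as np
from scipy.stats import chi2

def detect_drift(train, test, k=5, alpha=0.05, seed=0):
    # Step 1 (chunking) is assumed done: `train` and `test` are two chunks.
    rng = np.random.default_rng(seed)
    # Step 2: abstract the training chunk into a comparable model; here a
    # crude k-means partition is used in place of the paper's EI-kMeans.
    centroids = train[rng.choice(len(train), size=k, replace=False)].copy()
    for _ in range(20):
        labels = ((train[:, None] - centroids[None]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = train[labels == j].mean(axis=0)
    # Step 3: Pearson's chi-square statistic between the bin counts expected
    # under "no drift" and those observed on the test chunk.
    expected = np.bincount(labels, minlength=k) * len(test) / len(train)
    observed = np.bincount(
        ((test[:, None] - centroids[None]) ** 2).sum(-1).argmin(1), minlength=k)
    mask = expected > 0
    stat = ((observed[mask] - expected[mask]) ** 2 / expected[mask]).sum()
    # Step 4: reject the null hypothesis of no drift at significance level alpha.
    return stat > chi2.ppf(1 - alpha, df=mask.sum() - 1)
```

An identical test chunk reproduces the training bin counts exactly (statistic 0, no drift flagged), while a strongly shifted chunk concentrates its mass in a few partitions and drives the statistic past the critical value.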

References
[1] G. Boracchi, D. Carrera, C. Cervellera and D. Maccio, "QuantTree: Histograms for change detection in multivariate data streams", Proc. Int. Conf. Mach. Learn., pp. 638-647, 2018.
[2] L. Rutkowski, M. Jaworski, L. Pietruczuk and P. Duda, "Decision trees for mining data streams based on the Gaussian approximation", IEEE Trans. Knowl. Data Eng., vol. 26, no. 1, pp. 108-119, Jan. 2014.
[3] L. L. Minku and X. Yao, "DDD: A new ensemble approach for dealing with concept drift", IEEE Trans. Knowl. Data Eng., vol. 24, no. 4, pp. 619-633, Apr. 2012.
[4] J. Lu, A. Liu, F. Dong, F. Gu, J. Gama and G. Zhang, "Learning under concept drift: A review", IEEE Trans. Knowl. Data Eng., vol. 31, no. 12, pp. 2346-2363, Dec. 2019.
[5] Y. Sun, K. Tang, L. L. Minku, S. Wang and X. Yao, "Online ensemble learning of data streams with gradually evolved classes", IEEE Trans. Knowl. Data Eng., vol. 28, no. 6, pp. 1532-1545, Jun. 2016.
[6] S. Wang, L. L. Minku and X. Yao, "Resampling-based ensemble methods for online class imbalance learning", IEEE Trans. Knowl. Data Eng., vol. 27, no. 5, pp. 1356-1368, May 2015.
[7] L. Bu, C. Alippi and D. Zhao, "A pdf-free change detection test based on density difference estimation", IEEE Trans. Neural Netw. Learn. Syst., vol. 29, no. 2, pp. 324-334, Feb. 2018.
[8] F. Liu, G. Zhang and J. Lu, "Heterogeneous domain adaptation: An unsupervised approach", IEEE Trans. Neural Netw. Learn. Syst.
[9] I. Žliobaitė, M. Pechenizkiy and J. Gama, "An overview of concept drift applications" in Big Data Analysis: New Algorithms for a New Society, Cham, Switzerland: Springer, pp. 91-114, 2016, [online] Available: https://link.springer.com/chapter/10.1007/978-3-319-26989-4_4#citeas.
[10] A. Bifet and R. Gavaldà, "Learning from time-changing data with adaptive windowing", Proc. SIAM Int. Conf. Data Min., pp. 443-448, 2007.
[11] L. Bu, D. Zhao and C. Alippi, "An incremental change detection test based on density difference estimation", IEEE Trans. Syst. Man Cybern. Syst., vol. 47, no. 10, pp. 2714-2726, Oct. 2017.
[12] R. Polikar, L. Upda, S. S. Upda and V. Honavar, "Learn++: An incremental learning algorithm for supervised neural networks", IEEE Trans. Syst. Man Cybern. C Appl. Rev., vol. 31, no. 4, pp. 497-508, Nov. 2001.
[13] N. Lu, G. Zhang and J. Lu, "Concept drift detection via competence models", Artif. Intell., vol. 209, pp. 11-28, Apr. 2014.
[14] A. Liu, J. Lu, F. Liu and G. Zhang, "Accumulating regional density dissimilarity for concept drift detection in data streams", Pattern Recognit., vol. 76, pp. 256-272, Apr. 2018.
[15] D. M. dos Reis, P. A. Flach, S. Matwin and G. E. A. P. A. Batista, "Fast unsupervised online drift detection using incremental Kolmogorov–Smirnov test", Proc. 22nd ACM SIGKDD Int. Conf. Knowl. Disc. Data Min., pp. 1545-1554, 2016.
[16] B. W. Silverman, Density Estimation for Statistics and Data Analysis, Boca Raton, FL, USA: Routledge, 2018.
[17] R. A. Finkel and J. L. Bentley, "Quad trees: A data structure for retrieval on composite keys", Acta Informatica, vol. 4, no. 1, pp. 1-9, 1974.
[18] G. Boracchi, C. Cervellera and D. Macciá, "Uniform histograms for change detection in multivariate data", Proc. IEEE Int. Joint Conf. Neural Netw., pp. 1732-1739, 2017.
[19] L. Wu, Y. Wang and S. Pan, "Exploiting attribute correlations: A novel trace lasso-based weakly supervised dictionary learning method", IEEE Trans. Cybern., vol. 47, no. 12, pp. 4497-4508, Dec. 2017.
[20] J. G. Moreno-Torres, T. Raeder, R. Alaiz-Rodríguez, N. V. Chawla and F. Herrera, "A unifying view on dataset shift in classification", Pattern Recognit., vol. 45, no. 1, pp. 521-530, 2012.
[21] J. Sun, H. Li and H. Adeli, "Concept drift-oriented adaptive and dynamic support vector machine ensemble with time window in corporate financial risk prediction", IEEE Trans. Syst. Man Cybern. Syst., vol. 43, no. 4, pp. 801-813, Jul. 2013.
[22] M. Das, M. Pratama, S. Savitri and Z. Jie, "Muse-RNN: A multilayer self-evolving recurrent neural network for data stream classification", Proc. 19th IEEE Int. Conf. Data Min., pp. 110-119, 2019.
[23] J. Gama, I. Zliobaite, A. Bifet, M. Pechenizkiy and A. Bouchachia, "A survey on concept drift adaptation", ACM Comput. Surveys, vol. 46, no. 4, pp. 1-37, 2014.
[24] N. Lu, J. Lu, G. Zhang and R. L. De Mantaras, "A concept drift-tolerant case-base editing technique", Artif. Intell., vol. 230, pp. 108-133, Jan. 2016.
[25] A. Liu, Y. Song, G. Zhang and J. Lu, "Regional concept drift detection and density synchronized drift adaptation", Proc. 26th Int. Joint Conf. Artif. Intell., pp. 2280-2286, 2017.
[26] C. Alippi, G. Boracchi and M. Roveri, "Hierarchical change-detection tests", IEEE Trans. Neural Netw. Learn. Syst., vol. 28, no. 2, pp. 246-258, Feb. 2017.
[27] G. Ditzler, M. Roveri, C. Alippi and R. Polikar, "Learning in non-stationary environments: A survey", IEEE Comput. Intell. Mag., vol. 10, no. 4, pp. 12-25, Nov. 2015.
[28] S. Ramírez-Gallego, B. Krawczyk, S. García, M. Woźniak and F. Herrera, "A survey on data preprocessing for data stream mining: Current status and future directions", Neurocomputing, vol. 239, pp. 39-57, May 2017.
[29] M. Pratama, C. Za’in, A. Ashfahani, Y. S. Ong and W. Ding, "Automatic construction of multi-layer perceptron network from streaming examples", Proc. 28th ACM Int. Conf. Inf. Knowl. Manag., pp. 1171-1180, 2019.
[30] S. Wang, L. L. Minku and X. Yao, "A systematic study of online class imbalance learning with concept drift", IEEE Trans. Neural Netw. Learn. Syst., vol. 29, no. 10, pp. 4802-4821, Oct. 2018.
