Making data stream classification tree-based ensembles lighter

Barbon Junior S
2018-01-01

Abstract

Recently, several classification algorithms capable of dealing with potentially infinite data streams have been proposed. One of the main challenges of this task is to continuously update predictive models to handle concept drift without compromising their predictive performance. Moreover, the classification algorithm must cope efficiently with processing time and memory limitations. In the data stream mining literature, ensemble-based classification algorithms are a good alternative to satisfy these requirements. These algorithms combine multiple weak learners, e.g., the Very Fast Decision Tree (VFDT), to create a model with higher predictive performance. However, the memory cost of each weak learner accumulates in an ensemble, straining the limited memory available. To manage the trade-off between accuracy, memory space, and processing time, this paper proposes the Strict VFDT (SVFDT) as an alternative weak learner for ensembles, capable of reducing memory consumption without harming predictive performance. The paper experimentally compares two traditional and three state-of-the-art ensembles, using the VFDT and the SVFDT as weak learners, across thirteen benchmark datasets. According to the experimental results, the proposed approach obtains similar predictive performance with significant savings in memory space.
Publication year: 2018
ISBN: 9781538680230
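
To illustrate the kind of setup the paper evaluates, the sketch below runs a prequential (test-then-train) loop with a bagging ensemble of Hoeffding trees (VFDT) on a synthetic stream using scikit-multiflow. The SVFDT proposed by the authors is not part of scikit-multiflow; it would act as a drop-in replacement for the HoeffdingTreeClassifier base learner. The SEA generator, the OzaBagging ensemble, and all parameter values are illustrative assumptions, not the paper's exact experimental protocol.

```python
# Minimal sketch (assumes scikit-multiflow is installed): a bagging ensemble of
# Hoeffding trees (VFDT) evaluated prequentially on a synthetic data stream.
# The SVFDT from the paper would replace HoeffdingTreeClassifier as base learner.
from skmultiflow.data import SEAGenerator
from skmultiflow.trees import HoeffdingTreeClassifier
from skmultiflow.meta import OzaBaggingClassifier

stream = SEAGenerator(random_state=1)              # synthetic binary-class stream
ensemble = OzaBaggingClassifier(
    base_estimator=HoeffdingTreeClassifier(),      # VFDT weak learner
    n_estimators=10,
    random_state=1,
)

n_samples, correct = 0, 0
while n_samples < 20000 and stream.has_more_samples():
    X, y = stream.next_sample()                    # one instance at a time
    if n_samples > 0:                              # test before training (prequential)
        correct += int(ensemble.predict(X)[0] == y[0])
    ensemble.partial_fit(X, y, classes=stream.target_values)
    n_samples += 1

print(f"Prequential accuracy: {correct / (n_samples - 1):.4f}")
```

In this setup, memory grows with the number and size of the base trees, which is precisely the cost the SVFDT is designed to curb while keeping accuracy comparable.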

Use this identifier to cite or link to this document: https://hdl.handle.net/11368/3004444

Citations
  • Scopus: 5
  • Web of Science (ISI): 3