Selective ensemble of classifiers trained on selective samples
Date
2022-04-14
Publisher
Elsevier B.V.
Abstract
Classifier ensembles achieve high classification quality thanks to their generalizing ability. However, most existing ensemble algorithms use all learning samples to train the base classifiers, which may negatively impact the ensemble's diversity, and existing ensemble pruning algorithms often return suboptimal solutions biased by their selection criteria. In this work, we present a proposal to alleviate these drawbacks. We employ an instance selection method to obtain a reduced training set, which lowers both the space complexity of the resulting ensemble members and the time needed to classify an instance. Additionally, we propose a guided, search-based pruning schema that efficiently explores large ensembles and yields a near-optimal sub-ensemble with lower computational requirements: a reduced memory footprint and improved prediction time. We show experimentally that the proposed method can serve as an alternative to large ensembles, and we demonstrate how to form small, less complex, and highly accurate ensembles through our proposal. Experiments on 25 datasets show that the proposed method produces ensembles that outperform Random Forest and baseline classifier pruning methods. Moreover, our proposal is comparable with the Extreme Gradient Boosting algorithm in terms of accuracy.
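The two ideas in the abstract (training base classifiers on reduced sample sets, then pruning the ensemble by a guided search) can be illustrated with a minimal sketch. This is *not* the paper's exact algorithm: random subsampling stands in for the instance selection method, and a simple greedy forward search stands in for the guided pruning schema; all names and parameters below are illustrative.

```python
# Sketch: bagged trees trained on reduced random subsets, then greedily
# pruned to a small sub-ensemble by validation accuracy of the majority vote.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# 1) Train base classifiers on *reduced* training sets (here: random 30%
#    subsets, a stand-in for a proper instance-selection method).
ensemble = []
for i in range(20):
    idx = rng.choice(len(X_tr), size=len(X_tr) // 3, replace=False)
    ensemble.append(DecisionTreeClassifier(random_state=i).fit(X_tr[idx], y_tr[idx]))

def vote_accuracy(members):
    """Validation accuracy of the majority vote over `members`."""
    votes = np.mean([m.predict(X_val) for m in members], axis=0) >= 0.5
    return float(np.mean(votes == y_val))

# 2) Greedy forward pruning (a simple proxy for the guided search): keep
#    adding whichever member raises validation accuracy most, stop when
#    no remaining member improves it.
selected, remaining, best = [], list(ensemble), 0.0
while remaining:
    gains = [vote_accuracy(selected + [m]) for m in remaining]
    if selected and max(gains) <= best:
        break
    best = max(gains)
    selected.append(remaining.pop(int(np.argmax(gains))))

print(f"kept {len(selected)} of {len(ensemble)} members, val acc {best:.3f}")
```

The sub-ensemble is typically much smaller than the original pool, which is what reduces memory use and prediction time; the greedy search here is cheap but can return a local optimum, which is the kind of bias the paper's guided search is designed to reduce.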
Citation
Mohammed, A. M., Onieva, E., & Woźniak, M. (2022). Selective ensemble of classifiers trained on selective samples. Neurocomputing, 482, 197-211. https://doi.org/10.1016/j.neucom.2021.11.045