Title: Selective ensemble of classifiers trained on selective samples
Authors: Elsayed, Amgad Monir Mohamed; Onieva Caracuel, Enrique; Woźniak, Michał
Type: journal article
Language: English
Date issued: 2022-04-14
Date deposited: 2024-11-22
Citation: M. Mohammed, A., Onieva, E., & Woźniak, M. (2022). Selective ensemble of classifiers trained on selective samples. Neurocomputing, 482, 197-211. https://doi.org/10.1016/J.NEUCOM.2021.11.045
ISSN: 0925-2312
eISSN: 1872-8286
DOI: 10.1016/J.NEUCOM.2021.11.045
URI: http://hdl.handle.net/20.500.14454/2079
Keywords: Big data; Data reduction; Difficult samples; Ensemble pruning; Ensemble selection; Machine learning; Meta-heuristics; Multiple classifier systems; Ordering-based pruning

Abstract: Classifier ensembles are characterized by high classification quality thanks to their generalizing ability. However, most existing ensemble algorithms use all learning samples to train the base classifiers, which may negatively impact the ensemble's diversity. Moreover, existing ensemble pruning algorithms often return suboptimal solutions that are biased by their selection criteria. In this work, we present a proposal to alleviate these drawbacks. We employ an instance selection method to query a reduced training set, which lowers both the space complexity of the formed ensemble members and the time complexity of classifying an instance. Additionally, we propose a guided search-based pruning schema that efficiently explores large ensembles and yields a near-optimal subensemble with lower computational requirements, a reduced memory footprint, and improved prediction time. We show experimentally how the proposed method can serve as an alternative to large ensembles, producing less complex, small, and highly accurate ensembles. Experiments on 25 datasets show that the proposed method builds effective ensembles that outperform Random Forest and baseline classifier-pruning methods, and that our proposal is comparable with the Extreme Gradient Boosting algorithm in terms of accuracy.
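The ordering-based pruning named in the keywords can be illustrated with a generic greedy forward-selection sketch: classifiers from a trained pool are added one at a time, each step keeping the candidate that most improves voted accuracy on a held-out validation set. This is a minimal, hypothetical illustration of the general technique, not the paper's guided-search schema or its instance-selection step; the toy threshold "stump" classifiers and data below are invented for the example.

```python
# Hedged sketch of ordering-based ensemble pruning via greedy forward
# selection. Illustrative only: the paper's guided-search pruning and
# instance selection are not reproduced here.
from collections import Counter

def majority_vote(classifiers, x):
    """Predict by majority vote of the selected classifiers."""
    votes = Counter(clf(x) for clf in classifiers)
    return votes.most_common(1)[0][0]

def accuracy(classifiers, data):
    """Fraction of validation samples the voted subensemble gets right."""
    return sum(majority_vote(classifiers, x) == y for x, y in data) / len(data)

def greedy_prune(pool, val_data, target_size):
    """Order the pool greedily: at each step keep the classifier whose
    addition gives the highest voted accuracy on the validation set."""
    selected, remaining = [], list(pool)
    while remaining and len(selected) < target_size:
        best = max(remaining, key=lambda c: accuracy(selected + [c], val_data))
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy pool: threshold "stumps" on a single feature (hypothetical data).
pool = [lambda x, t=t: int(x > t) for t in (0.1, 0.3, 0.5, 0.7, 0.9)]
val_data = [(0.2, 0), (0.4, 0), (0.6, 1), (0.8, 1)]
sub = greedy_prune(pool, val_data, target_size=3)
print(len(sub), accuracy(sub, val_data))  # → 3 1.0
```

The greedy ordering is one simple instance of ordering-based pruning; the paper argues that such selection criteria can bias toward suboptimal subensembles, which is what its guided-search schema is designed to mitigate.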