Selective ensemble of classifiers trained on selective samples
dc.contributor.author | Elsayed, Amgad Monir Mohamed | |
dc.contributor.author | Onieva Caracuel, Enrique | |
dc.contributor.author | Woźniak, Michał | |
dc.date.accessioned | 2024-11-22T08:11:23Z | |
dc.date.available | 2024-11-22T08:11:23Z | |
dc.date.issued | 2022-04-14 | |
dc.date.updated | 2024-11-22T08:11:23Z | |
dc.description.abstract | Classifier ensembles are characterized by high classification quality, thanks to their generalizing ability. Most existing ensemble algorithms use all learning samples to train the base classifiers, which may negatively impact the ensemble's diversity. Also, existing ensemble pruning algorithms often return suboptimal solutions that are biased by the selection criteria. In this work, we present a proposal to alleviate these drawbacks. We employ an instance selection method to query a reduced training set, which lowers both the space complexity of the formed ensemble members and the time complexity of classifying an instance. Additionally, we propose a guided search-based pruning schema that efficiently explores large-size ensembles and yields a near-optimal subensemble with lower computational requirements, a reduced memory footprint, and improved prediction time. We show experimentally how the proposed method can serve as an alternative to large-size ensembles, and demonstrate how to form less complex, small-size, and highly accurate ensembles through our proposal. Experiments on 25 datasets show that the proposed method can produce effective ensembles that outperform Random Forest and baseline classifier pruning methods. Moreover, our proposition is comparable with the Extreme Gradient Boosting algorithm in terms of accuracy. | en |
dc.description.sponsorship | This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 665959. In addition, this work was supported in part by the LOGISTAR project, funded by the European Union Horizon 2020 Research and Innovation Programme under grant agreement No. 769142. Michał Woźniak was supported by the Polish National Science Centre under Grant No. 2017/27/B/ST6/01325. | en |
dc.identifier.citation | Mohammed, A. M., Onieva, E., & Woźniak, M. (2022). Selective ensemble of classifiers trained on selective samples. Neurocomputing, 482, 197-211. https://doi.org/10.1016/J.NEUCOM.2021.11.045 | |
dc.identifier.doi | 10.1016/J.NEUCOM.2021.11.045 | |
dc.identifier.eissn | 1872-8286 | |
dc.identifier.issn | 0925-2312 | |
dc.identifier.uri | http://hdl.handle.net/20.500.14454/2079 | |
dc.language.iso | eng | |
dc.publisher | Elsevier B.V. | |
dc.subject.other | Big data | |
dc.subject.other | Data reduction | |
dc.subject.other | Difficult samples | |
dc.subject.other | Ensemble pruning | |
dc.subject.other | Ensemble selection | |
dc.subject.other | Machine learning | |
dc.subject.other | Meta-heuristics | |
dc.subject.other | Multiple classifier systems | |
dc.subject.other | Ordering-based pruning | |
dc.title | Selective ensemble of classifiers trained on selective samples | en |
dc.type | journal article | |
dcterms.accessRights | metadata only access | |
oaire.citation.endPage | 211 | |
oaire.citation.startPage | 197 | |
oaire.citation.title | Neurocomputing | |
oaire.citation.volume | 482 |