Selective ensemble of classifiers trained on selective samples

dc.contributor.author: Elsayed, Amgad Monir Mohamed
dc.contributor.author: Onieva Caracuel, Enrique
dc.contributor.author: Woźniak, Michał
dc.date.accessioned: 2024-11-22T08:11:23Z
dc.date.available: 2024-11-22T08:11:23Z
dc.date.issued: 2022-04-14
dc.date.updated: 2024-11-22T08:11:23Z
dc.description.abstract: Classifier ensembles achieve high classification quality thanks to their generalization ability. Most existing ensemble algorithms use all training samples to learn the base classifiers, which may negatively impact the ensemble's diversity. In addition, existing ensemble pruning algorithms often return suboptimal solutions that are biased by the selection criteria. In this work, we present a proposal to alleviate these drawbacks. We employ an instance selection method to query a reduced training set, which lowers both the space complexity of the ensemble members and the time needed to classify an instance. Additionally, we propose a guided search-based pruning scheme that effectively explores large ensembles and yields a near-optimal subensemble with a reduced memory footprint and improved prediction time. We show experimentally that the proposed method can serve as an alternative to large ensembles, producing less complex, small, and highly accurate ensembles. Experiments on 25 datasets show that the proposed method produces ensembles that outperform Random Forest and baseline classifier pruning methods. Moreover, our proposal is comparable with Extreme Gradient Boosting in terms of accuracy.
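For illustration only, the sketch below shows the general flavour of ordering-based ensemble pruning mentioned in the abstract: a pool of bagged decision trees is built, then a greedy forward search keeps the members that most improve majority-vote accuracy on a held-out validation set. It is a minimal sketch using scikit-learn, not the authors' algorithm; the pool size, target ensemble size, greedy criterion, and the omission of the instance selection step are all assumptions made here for brevity.

```python
# Illustrative sketch (NOT the paper's method): greedy ordering-based pruning
# of a bagged decision-tree pool, scored by majority-vote validation accuracy.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.RandomState(0)
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# 1) Build a pool of base classifiers, each trained on a bootstrap sample.
pool = []
for _ in range(50):
    idx = rng.choice(len(X_tr), size=len(X_tr), replace=True)
    pool.append(DecisionTreeClassifier(random_state=0).fit(X_tr[idx], y_tr[idx]))

def vote_accuracy(members):
    """Majority-vote accuracy of a candidate subensemble on the validation set."""
    preds = np.stack([m.predict(X_val) for m in members])          # shape (k, n_val)
    maj = np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, preds)
    return (maj == y_val).mean()

# 2) Greedy forward selection: repeatedly add the classifier that most improves
#    validation accuracy, stopping at a (hypothetical) target subensemble size.
selected, remaining, target_size = [], list(range(len(pool))), 10
while remaining and len(selected) < target_size:
    best = max(remaining, key=lambda i: vote_accuracy([pool[j] for j in selected + [i]]))
    selected.append(best)
    remaining.remove(best)

print("pruned ensemble size:", len(selected),
      "validation accuracy:", round(vote_accuracy([pool[i] for i in selected]), 3))
```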
dc.description.sponsorship: This project has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 665959. In addition, this work was supported in part by the LOGISTAR project, funded by the European Union Horizon 2020 Research and Innovation Programme under grant agreement No. 769142. Michał Woźniak was supported by the Polish National Science Center under Grant No. 2017/27/B/ST6/01325.
dc.identifier.citation: M. Mohammed, A., Onieva, E., & Woźniak, M. (2022). Selective ensemble of classifiers trained on selective samples. Neurocomputing, 482, 197-211. https://doi.org/10.1016/J.NEUCOM.2021.11.045
dc.identifier.doi: 10.1016/J.NEUCOM.2021.11.045
dc.identifier.eissn: 1872-8286
dc.identifier.issn: 0925-2312
dc.identifier.uri: http://hdl.handle.net/20.500.14454/2079
dc.language.iso: eng
dc.publisher: Elsevier B.V.
dc.subject.other: Big data
dc.subject.other: Data reduction
dc.subject.other: Difficult samples
dc.subject.other: Ensemble pruning
dc.subject.other: Ensemble selection
dc.subject.other: Machine learning
dc.subject.other: Meta-heuristics
dc.subject.other: Multiple classifier systems
dc.subject.other: Ordering-based pruning
dc.title: Selective ensemble of classifiers trained on selective samples
dc.type: journal article
dcterms.accessRights: metadata only access
oaire.citation.endPage: 211
oaire.citation.startPage: 197
oaire.citation.title: Neurocomputing
oaire.citation.volume: 482