TY  - JFULL
AU  - Kotsiantis, S. B.
AU  - Pintelas, P. E.
PY  - 2007/9/
TI  - Combining Bagging and Boosting
T2  - International Journal of Mathematical and Computational Sciences
SP  - 371
EP  - 381
VL  - 1
SN  - 1307-6892
UR  - https://publications.waset.org/pdf/4742
PU  - World Academy of Science, Engineering and Technology
NX  - Open Science Index 8, 2007
N2  - Bagging and boosting are among the most popular resampling ensemble methods that generate and combine a diversity of classifiers using the same learning algorithm for the base classifiers. Boosting algorithms are considered stronger than bagging on noise-free data. However, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, in this work we built an ensemble using a voting methodology over bagging and boosting ensembles, with 10 sub-classifiers in each one. We performed a comparison with simple bagging and boosting ensembles with 25 sub-classifiers, as well as with other well-known combining methods, on standard benchmark datasets, and the proposed technique was the most accurate.
ER  - 