Improving the weighted distribution estimation for AdaBoost using a novel concurrent approach

Héctor Allende-Cid, Carlos Valle, Claudio Moraga, Héctor Allende, Rodrigo Salas

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

AdaBoost is one of the best-known ensemble approaches in the machine learning literature. Several AdaBoost variants that use parallel processing to speed up computation on large datasets have recently been proposed. These approaches try to approximate classic AdaBoost and thus sacrifice some of its generalization ability. In this work, we instead use concurrent computing to improve the estimation of the weight distribution, thereby improving generalization. In each round we train several weak hypotheses in parallel and use a weighted ensemble of them to update the distribution weights for the following boosting rounds. Our results show that in most cases the performance of AdaBoost is improved and that the algorithm converges rapidly. We validate our proposal on four well-known real data sets.
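The abstract describes training several weak hypotheses in parallel per round and using their weighted ensemble to drive the distribution-weight update. The sketch below illustrates that idea under stated assumptions: the paper's exact training and combination rules are not given in the abstract, so here each of the `k` per-round learners is a decision stump trained on a weighted bootstrap resample (a stand-in for concurrent training), the per-round committee combines stumps by their individual log-odds weights, and the committee's sign prediction drives a standard AdaBoost-style exponential weight update. All function names (`concurrent_adaboost`, `stump_train`, etc.) are illustrative, not the authors' code.

```python
import numpy as np

def stump_train(X, y, w):
    # Exhaustive search for the best threshold stump under sample weights w.
    best = (np.inf, 0, 0.0, 1)  # (weighted error, feature, threshold, polarity)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(X[:, j] <= thr, pol, -pol)
                err = w[pred != y].sum()
                if err < best[0]:
                    best = (err, j, thr, pol)
    return best[1:]

def stump_predict(X, j, thr, pol):
    return np.where(X[:, j] <= thr, pol, -pol)

def concurrent_adaboost(X, y, rounds=10, k=3, seed=None):
    """Per round: train k stumps (parallelizable), form a weighted
    committee, and update the boosting distribution with the
    committee's prediction instead of a single learner's."""
    rng = np.random.default_rng(seed)
    n = len(y)
    w = np.ones(n) / n
    model = []  # list of (alpha, members) per round
    for _ in range(rounds):
        members = []
        for _ in range(k):
            idx = rng.choice(n, size=n, p=w)       # weighted bootstrap resample
            j, thr, pol = stump_train(X[idx], y[idx], np.ones(n) / n)
            pred = stump_predict(X, j, thr, pol)
            e = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)
            beta = 0.5 * np.log((1 - e) / e)        # member's committee weight
            members.append((beta, j, thr, pol))
        # Committee prediction for this round drives the weight update.
        agg = sum(b * stump_predict(X, j, thr, pol) for b, j, thr, pol in members)
        pred = np.where(agg >= 0, 1, -1)
        eps = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - eps) / eps)
        w *= np.exp(-alpha * y * pred)              # AdaBoost-style update
        w /= w.sum()
        model.append((alpha, members))
    return model

def predict(model, X):
    total = np.zeros(len(X))
    for alpha, members in model:
        agg = sum(b * stump_predict(X, j, thr, pol) for b, j, thr, pol in members)
        total += alpha * np.where(agg >= 0, 1, -1)
    return np.where(total >= 0, 1, -1)
```

The inner loop over the `k` members is embarrassingly parallel, which is where a concurrent implementation would place its workers; only the per-round weight update is sequential.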

Original language: English
Pages (from-to): 223-232
Number of pages: 10
Series: Studies in Computational Intelligence
Volume: 616
DOI: https://doi.org/10.1007/978-3-319-25017-5_21
Publication status: Published - 2016
Externally published: Yes
