Improving the weighted distribution estimation for AdaBoost using a novel concurrent approach

Héctor Gabriel Allende Cid, Carlos Valle, Claudio Moraga, Héctor Allende, Rodrigo Salas

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

AdaBoost is one of the best-known ensemble approaches in the machine learning literature. Several AdaBoost variants that use parallel processing to speed up computation on large datasets have recently been proposed. These approaches try to approximate classic AdaBoost and, in doing so, sacrifice some of its generalization ability. In this work, we use concurrent computing to improve the estimation of the weight distribution, thereby improving generalization. In each round, we train several weak hypotheses in parallel and use a weighted ensemble of them to update the distribution weights for the following boosting rounds. Our results show that in most cases the performance of AdaBoost is improved and that the algorithm converges rapidly. We validate our proposal on four well-known real data sets.
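The round structure described in the abstract can be illustrated with a short sketch. The following Python code is a minimal, hypothetical reconstruction of the general idea only, not the authors' exact algorithm: the bootstrap resampling, the per-hypothesis accuracy weights, and the committee-based reweighting rule are all illustrative assumptions, and the helper names (fit_weak_learner, concurrent_adaboost, predict) are invented for this example.

```python
# Sketch: each boosting round trains k weak hypotheses concurrently, and the
# sample distribution is updated using the round's weighted committee vote.
# Assumption-based illustration, not the paper's exact update rule.
# Labels are assumed to be in {-1, +1}.
import numpy as np
from concurrent.futures import ThreadPoolExecutor
from sklearn.tree import DecisionTreeClassifier

def fit_weak_learner(X, y, w, seed):
    """Train one weak hypothesis on a bootstrap sample drawn from distribution w."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=len(X), replace=True, p=w)
    clf = DecisionTreeClassifier(max_depth=1, random_state=seed)
    clf.fit(X[idx], y[idx])
    return clf

def concurrent_adaboost(X, y, rounds=10, k=4):
    """AdaBoost-style loop: each round trains k weak hypotheses in parallel
    and updates the distribution with their weighted vote."""
    n = len(X)
    w = np.full(n, 1.0 / n)          # sample distribution D_t
    ensemble = []                    # one (alpha_t, alphas, learners) triple per round
    for t in range(rounds):
        with ThreadPoolExecutor(max_workers=k) as pool:
            futures = [pool.submit(fit_weak_learner, X, y, w, t * k + j)
                       for j in range(k)]
            learners = [f.result() for f in futures]
        preds = np.array([c.predict(X) for c in learners])    # shape (k, n)
        # Weight each hypothesis by its error under the current distribution.
        errs = np.clip([np.sum(w * (p != y)) for p in preds], 1e-10, 1 - 1e-10)
        alphas = 0.5 * np.log((1 - errs) / errs)
        committee = np.sign(alphas @ preds)                   # round's weighted vote
        eps = np.clip(np.sum(w * (committee != y)), 1e-10, 1 - 1e-10)
        alpha_t = 0.5 * np.log((1 - eps) / eps)
        w *= np.exp(-alpha_t * y * committee)                 # AdaBoost-style reweighting
        w /= w.sum()
        ensemble.append((alpha_t, alphas, learners))
    return ensemble

def predict(ensemble, X):
    """Outer weighted vote over rounds; inner weighted vote within each round."""
    score = np.zeros(len(X))
    for alpha_t, alphas, learners in ensemble:
        inner = np.sign(alphas @ np.array([c.predict(X) for c in learners]))
        score += alpha_t * inner
    return np.sign(score)

# Toy usage on synthetic data:
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
model = concurrent_adaboost(X, y, rounds=5, k=4)
print("train accuracy:", np.mean(predict(model, X) == y))
```

The key departure from classic AdaBoost in this sketch is that the distribution update is driven by the committee prediction of several concurrently trained hypotheses rather than by a single weak learner; how the paper combines and weights the per-round hypotheses may differ.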

Original language: English
Pages (from-to): 223-232
Number of pages: 10
Journal: Studies in Computational Intelligence
Volume: 616
DOI
Status: Published - 2016
Published externally: Yes
