Improving the weighted distribution estimation for AdaBoost using a novel concurrent approach

Héctor Allende-Cid, Carlos Valle, Claudio Moraga, Héctor Allende, Rodrigo Salas

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

AdaBoost is one of the best-known ensemble approaches in the machine learning literature. Several AdaBoost variants that use parallel processing to speed up computation on large datasets have recently been proposed. These approaches approximate the classic AdaBoost and thereby sacrifice some of its generalization ability. In this work, we use concurrent computing to improve the estimation of the weight distribution, thereby improving generalization. In each round we train several weak hypotheses in parallel and use a weighted ensemble of them to update the distribution weights for the following boosting rounds. Our results show that in most cases the performance of AdaBoost is improved and that the algorithm converges rapidly. We validate our proposal on 4 well-known real data sets.
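The round structure described in the abstract can be sketched as follows. This is an illustrative approximation only, not the authors' exact algorithm: it assumes binary labels in {-1, +1}, decision stumps as the weak hypotheses, and a simple weighted vote of each round's stumps to drive the distribution update; the function names (`fit_stump`, `concurrent_round`) and the parameter `K` (hypotheses per round) are hypothetical.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor


def fit_stump(X, y, w, feat):
    # Weak hypothesis: best threshold/polarity on one feature
    # under the current weight distribution w.
    xs, best = X[:, feat], (0.0, np.inf, 1)
    for t in np.unique(xs):
        for sign in (1, -1):
            pred = np.where(xs >= t, sign, -sign)
            err = float(np.sum(w * (pred != y)))
            if err < best[1]:
                best = (t, err, sign)
    t, _, sign = best
    return lambda Z, f=feat, t=t, s=sign: np.where(Z[:, f] >= t, s, -s)


def concurrent_round(X, y, w, K=4):
    # One boosting round: train K weak hypotheses concurrently,
    # combine them into a weighted ensemble, and use that ensemble's
    # vote (rather than a single hypothesis) to update the
    # distribution weights for the next round.
    feats = [k % X.shape[1] for k in range(K)]
    with ThreadPoolExecutor() as pool:
        stumps = list(pool.map(lambda f: fit_stump(X, y, w, f), feats))
    alphas, preds = [], []
    for h in stumps:
        p = h(X)
        err = np.clip(np.sum(w * (p != y)), 1e-10, 1 - 1e-10)
        alphas.append(0.5 * np.log((1 - err) / err))  # AdaBoost alpha
        preds.append(p)
    # Weighted-ensemble vote of this round's hypotheses.
    agg = np.sign(np.sum([a * p for a, p in zip(alphas, preds)], axis=0))
    ens_err = np.clip(np.sum(w * (agg != y)), 1e-10, 1 - 1e-10)
    alpha = 0.5 * np.log((1 - ens_err) / ens_err)
    # Multiplicative weight update driven by the ensemble prediction.
    w = w * np.exp(-alpha * y * agg)
    return stumps, alphas, w / w.sum()
```

Because each of the `K` stumps is trained on the same weighted sample, they can be fitted independently in parallel; only the distribution update at the end of the round is sequential.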

Original language: English
Pages (from-to): 223-232
Number of pages: 10
Journal: Studies in Computational Intelligence
Volume: 616
DOIs
State: Published - 2016
