Robust alternating AdaBoost

Héctor Allende-Cid, Rodrigo Salas, Héctor Allende, Ricardo Ñanculef

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

12 Scopus citations


Ensemble methods are general techniques for improving the accuracy of any given learning algorithm. Boosting is a learning algorithm that builds classifier ensembles incrementally. In this work we propose an improvement of the classical and inverse AdaBoost algorithms to deal with the presence of outliers in the data. We propose the Robust Alternating AdaBoost (RADA) algorithm, which alternates between classic and inverse AdaBoost to create a more stable algorithm. The RADA algorithm bounds the influence of outliers on the empirical distribution, detects and diminishes the empirical probability of "bad" samples, and performs more accurate classification under contaminated data. We report performance results on synthetic and real datasets, the latter obtained from a benchmark site.
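The abstract describes RADA only at a high level: alternate between the classic and inverse AdaBoost weight updates, and bound the weight any single (possibly outlying) sample can accumulate. A minimal sketch of that idea, using decision stumps as weak learners, might look as follows; the `cap` and `inverse_every` parameters and the clipping scheme are illustrative assumptions, not the update rules of the published algorithm:

```python
import numpy as np

def train_stump(X, y, w):
    """Exhaustive weighted decision stump: pick the (feature, threshold,
    polarity) minimizing the weighted 0/1 error."""
    best = (np.inf, 0, 0.0, 1)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(X[:, j] <= thr, pol, -pol)
                err = w[pred != y].sum()
                if err < best[0]:
                    best = (err, j, thr, pol)
    return best

def stump_predict(X, j, thr, pol):
    return np.where(X[:, j] <= thr, pol, -pol)

def rada_sketch(X, y, rounds=3, cap=0.2, inverse_every=3):
    """Toy boosting loop in the spirit of RADA: sample weights are clipped
    at `cap` to bound outlier influence, and every `inverse_every`-th round
    applies the inverse-AdaBoost update (up-weighting correctly classified
    samples instead of misclassified ones). Both parameters are
    hypothetical, chosen for illustration only."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for t in range(rounds):
        err, j, thr, pol = train_stump(X, y, w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        ensemble.append((alpha, j, thr, pol))
        margin = y * stump_predict(X, j, thr, pol)   # +1 if correct, -1 if wrong
        direction = 1.0 if (t + 1) % inverse_every else -1.0
        w = w * np.exp(-direction * alpha * margin)  # classic or inverse step
        w = np.minimum(w, cap)                       # bound any single sample's weight
        w = w / w.sum()
    return ensemble

def predict(ensemble, X):
    """Weighted-majority vote of the stumps."""
    score = sum(a * stump_predict(X, j, thr, pol) for a, j, thr, pol in ensemble)
    return np.sign(score)
```

On a 1-D toy set with one mislabeled outlier (e.g. `X = [[0],[1],[2],[3],[4],[5],[10]]`, `y = [-1,-1,-1,1,1,1,-1]`), the weight cap keeps the outlier from dominating the empirical distribution, and the ensemble still classifies the six clean points correctly.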

Original language: English
Title of host publication: Progress in Pattern Recognition, Image Analysis and Applications - 12th Iberoamerican Congress on Pattern Recognition, CIARP 2007, Proceedings
Number of pages: 10
State: Published - 2007
Externally published: Yes
Event: 12th Iberoamerican Congress on Pattern Recognition, CIARP 2007 - Viña del Mar-Valparaíso, Chile
Duration: 13 Nov 2007 – 16 Nov 2007

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 4756 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 12th Iberoamerican Congress on Pattern Recognition, CIARP 2007
City: Viña del Mar-Valparaíso


Keywords

  • AdaBoost
  • Machine ensembles
  • Robust learning algorithms


