Robust alternating AdaBoost

Héctor Allende-Cid, Rodrigo Salas, Héctor Allende, Ricardo Ñanculef

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

11 Citations (Scopus)

Abstract

Ensemble methods are general techniques for improving the accuracy of any given learning algorithm. Boosting is a learning algorithm that builds classifier ensembles incrementally. In this work we propose an improvement of the classical and inverse AdaBoost algorithms to deal with the presence of outliers in the data. We propose the Robust Alternating AdaBoost (RADA) algorithm, which alternates between classic and inverse AdaBoost to create a more stable algorithm. The RADA algorithm bounds the influence of outliers on the empirical distribution, detects and diminishes the empirical probability of "bad" samples, and performs more accurate classification under contaminated data. We report performance results using synthetic and real datasets, the latter obtained from a benchmark site.
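
The alternation described in the abstract can be illustrated with a minimal sketch in Python using scikit-learn decision stumps. The even/odd alternation schedule, the fixed weight cap, and the exact update rules below are illustrative assumptions drawn only from the abstract's high-level description, not the published RADA formulation:

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def alternating_adaboost(X, y, T=20, weight_cap=0.1):
    """Fit T decision stumps, alternating classic and inverse weight updates.

    Labels in y are assumed to be in {-1, +1}. The cap value and the
    even/odd schedule are illustrative choices, not the authors' settings.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)                # empirical distribution over samples
    learners, alphas = [], []
    for t in range(T):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        miss = stump.predict(X) != y
        err = np.clip(w[miss].sum(), 1e-10, 1.0 - 1e-10)
        alpha = 0.5 * np.log((1.0 - err) / err)
        if t % 2 == 0:
            w = w * np.exp(alpha * miss)   # classic step: up-weight errors
        else:
            w = w * np.exp(-alpha * miss)  # inverse step: down-weight errors,
                                           # damping suspected outliers
        w = np.minimum(w, weight_cap)      # bound any single sample's influence
        w = w / w.sum()                    # renormalize to a distribution
        learners.append(stump)
        alphas.append(alpha)
    return learners, np.array(alphas)

def predict(learners, alphas, X):
    votes = sum(a * h.predict(X) for a, h in zip(alphas, learners))
    return np.sign(votes)

In this sketch the classic steps concentrate mass on hard examples while the inverse steps pull mass away from repeatedly misclassified points, and the cap keeps any single (possibly contaminated) sample from dominating the empirical distribution.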

Original language: English
Host publication title: Progress in Pattern Recognition, Image Analysis and Applications - 12th Iberoamerican Congress on Pattern Recognition, CIARP 2007, Proceedings
Pages: 427-436
Number of pages: 10
Status: Published - 2007
Externally published
Event: 12th Iberoamerican Congress on Pattern Recognition, CIARP 2007 - Vina del Mar-Valparaiso, Chile
Duration: 13 Nov 2007 - 16 Nov 2007

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 4756 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 12th Iberoamerican Congress on Pattern Recognition, CIARP 2007
Country/Territory: Chile
City: Vina del Mar-Valparaiso
Period: 13/11/07 - 16/11/07
