Multi-armed Bandit-Based Metaheuristic Operator Selection: The Pendulum Algorithm Binarization Case

Pablo Ábrego-Calderón, Broderick Crawford, Ricardo Soto, Eduardo Rodriguez-Tello, Felipe Cisternas-Caneo, Eric Monfroy, Giovanni Giachetti

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

The multi-armed bandit (MAB) is a well-known reinforcement learning algorithm that has shown outstanding performance in recommendation systems and other areas. On the other hand, metaheuristic algorithms have gained great popularity due to their strong performance in solving complex problems with vast search spaces. The Pendulum Search Algorithm (PSA) is a recently proposed metaheuristic inspired by the harmonic motion of a pendulum. Its main limitation lies in solving combinatorial optimization problems, which are characterized by variables in the discrete domain. To overcome this limitation, we propose using a two-step binarization technique, which offers a large number of possible combinations, each of which we call a scheme. For this, we use MAB as an algorithm that learns and recommends a binarization scheme during the execution of the iterations (online). The experiments carried out show that this approach delivers better results in solving the Set Covering Problem than using a fixed binarization scheme.
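
The two-step binarization pipeline described in the abstract (a transfer function followed by a binarization rule) yields many candidate schemes, and the proposal treats the choice among them as a bandit problem solved online. The following Python sketch illustrates the general idea only; it is not the authors' implementation, and the scheme list, the reward definition, and the placeholder fitness evaluation are assumptions made for illustration. It uses a standard UCB1 selection rule over hypothetical scheme "arms".

import math
import random

# Each "arm" is one hypothetical two-step scheme: transfer function + binarization rule.
SCHEMES = [
    ("S-shaped", "standard"),
    ("S-shaped", "elitist"),
    ("V-shaped", "standard"),
    ("V-shaped", "elitist"),
]

class UCB1SchemeSelector:
    """UCB1 bandit that selects a binarization scheme online, once per iteration."""

    def __init__(self, n_arms):
        self.counts = [0] * n_arms    # times each scheme has been chosen
        self.values = [0.0] * n_arms  # running mean reward per scheme
        self.total = 0                # total number of selections

    def select(self):
        # Play every arm once before applying the UCB1 rule.
        for arm, count in enumerate(self.counts):
            if count == 0:
                return arm
        # UCB1: mean reward plus an exploration bonus.
        return max(
            range(len(self.counts)),
            key=lambda a: self.values[a]
            + math.sqrt(2.0 * math.log(self.total) / self.counts[a]),
        )

    def update(self, arm, reward):
        self.counts[arm] += 1
        self.total += 1
        # Incremental update of the mean reward for this arm.
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]

# Hypothetical usage inside a metaheuristic loop: the reward is 1 when the
# chosen scheme improves the best Set Covering fitness found so far, else 0.
bandit = UCB1SchemeSelector(len(SCHEMES))
best_fitness = float("inf")
for iteration in range(100):
    arm = bandit.select()
    transfer, rule = SCHEMES[arm]
    # ... run one PSA iteration, binarize the solutions with (transfer, rule),
    # and evaluate the Set Covering objective; a random placeholder is used here ...
    new_fitness = random.uniform(400, 600)
    reward = 1.0 if new_fitness < best_fitness else 0.0
    best_fitness = min(best_fitness, new_fitness)
    bandit.update(arm, reward)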

Original language: English
Title of host publication: Optimization and Learning - 6th International Conference, OLA 2023, Proceedings
Editors: Bernabé Dorronsoro, Francisco Chicano, Gregoire Danoy, El-Ghazali Talbi
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 248-259
Number of pages: 12
ISBN (Print): 9783031340192
State: Published - 2023
Event: 6th International Conference on Optimization and Learning, OLA 2023 - Malaga, Spain
Duration: 3 May 2023 - 5 May 2023

Publication series

Name: Communications in Computer and Information Science
Volume: 1824 CCIS
ISSN (Print): 1865-0929
ISSN (Electronic): 1865-0937

Conference

Conference: 6th International Conference on Optimization and Learning, OLA 2023
Country/Territory: Spain
City: Malaga
Period: 3/05/23 - 5/05/23

Keywords

  • Binarization Schemes
  • Multi-Armed Bandit
  • Pendulum Search Algorithm
  • Reinforcement Learning
  • Set Covering Problem
