Bayesian Optimization Algorithm: A Promising Approach for Fitness Inheritance

Title: Bayesian Optimization Algorithm: A Promising Approach for Fitness Inheritance

Abstract: This study explores the concept of fitness inheritance in the Bayesian optimization algorithm (BOA). The goal is to estimate the fitness of some candidate solutions to reduce the number of expensive fitness evaluations. Bayesian networks, which are used in BOA to model promising solutions and generate new ones, are extended to allow for fitness inheritance. The results indicate that this approach is promising, as it leads to accurate fitness estimates even if only a small proportion of candidate solutions are evaluated using the actual fitness function. This can result in a significant reduction in the number of actual fitness evaluations.

Introduction: Genetic and evolutionary algorithms (GEAs) often need to maintain a large population of candidate solutions for many iterations to converge reliably to a global optimum. In many real-world problems, however, evaluating the fitness of a candidate solution is computationally expensive. This study investigates whether GEAs can evolve not only the population of candidate solutions but also a model of fitness, so that a proportion of newly generated candidate solutions is assigned an estimated fitness instead of being evaluated directly (fitness inheritance).
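To make the idea concrete, the following is a minimal sketch of a generational loop with fitness inheritance; it is not the algorithm from the paper. The names (true_fitness, estimate_fitness, proportion_inherited) are hypothetical, truncation selection is used as a placeholder, and uniform crossover stands in for the Bayesian-network model building and sampling that BOA actually performs.

```python
import random

def evolve(true_fitness, estimate_fitness, population,
           proportion_inherited=0.9, generations=50):
    """Generational loop with fitness inheritance (illustrative sketch).

    Only a fraction (1 - proportion_inherited) of new candidate solutions
    is evaluated with the expensive true fitness; the rest receives an
    estimate from a fitness model.
    """
    evaluated = [(x, true_fitness(x)) for x in population]
    evaluations = len(evaluated)

    for _ in range(generations):
        # Select promising solutions (truncation selection as a placeholder).
        evaluated.sort(key=lambda pair: pair[1], reverse=True)
        parents = [x for x, _ in evaluated[: max(2, len(evaluated) // 2)]]

        # Generate offspring; uniform crossover stands in for sampling the
        # Bayesian network that BOA would learn from the selected parents.
        offspring = []
        for _ in range(len(evaluated)):
            a, b = random.sample(parents, 2)
            child = [random.choice(genes) for genes in zip(a, b)]
            if random.random() < proportion_inherited:
                fit = estimate_fitness(child, evaluated)  # inherited (estimated)
            else:
                fit = true_fitness(child)                 # actual (expensive)
                evaluations += 1
            offspring.append((child, fit))

        evaluated = offspring

    best = max(evaluated, key=lambda pair: pair[1])
    return best, evaluations
```

The key point of the sketch is that the counter of actual evaluations grows only for the small fraction of offspring that bypass the estimate.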

Methodology: The study focuses on the Bayesian optimization algorithm (BOA), a probabilistic model-building genetic algorithm (PMBGA). BOA uses Bayesian networks to model promising solutions and to generate new ones. Two types of models are considered: (1) traditional Bayesian networks with full conditional probability tables (CPTs), as used in BOA, and (2) Bayesian networks with local structures, as used in BOA with decision graphs (dBOA) and in the hierarchical BOA (hBOA). The proposed fitness-inheritance method is examined on three example problems: onemax, concatenated traps of order 4, and concatenated traps of order 5.
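The three test problems have standard definitions and can be reproduced as follows (a straightforward transcription of the usual definitions, not code from the paper):

```python
def onemax(bits):
    """Onemax: fitness is simply the number of ones in the string."""
    return sum(bits)

def trap(block):
    """Fully deceptive trap over k = len(block) bits: the all-ones block
    scores k, while any other block scores k - 1 - u (u = number of ones),
    deceptively rewarding blocks with fewer ones."""
    k, u = len(block), sum(block)
    return k if u == k else k - 1 - u

def concatenated_traps(bits, k):
    """Sum of traps over consecutive, non-overlapping k-bit blocks."""
    assert len(bits) % k == 0, "string length must be a multiple of k"
    return sum(trap(bits[i:i + k]) for i in range(0, len(bits), k))

# Example with order-4 traps: one optimal block and one all-zeros block -> 4 + 3 = 7.
assert concatenated_traps([1, 1, 1, 1, 0, 0, 0, 0], 4) == 7
```

The extension of the model itself, storing fitness statistics alongside the network and using them to estimate the fitness of new solutions, can be sketched roughly as below. This is a hedged paraphrase under assumptions, not the paper's exact formulation: the estimate is taken as the population's mean fitness plus, for each variable, the average fitness observed for that variable's value given its parents' values, minus the average fitness observed for the parents' values alone.

```python
from collections import defaultdict

def learn_fitness_statistics(population, fitnesses, parents_of):
    """Collect average-fitness statistics alongside the network's CPT entries.

    `parents_of[i]` lists the indices of X_i's parents in the learned network.
    """
    mean_f = sum(fitnesses) / len(fitnesses)
    by_value, by_parents = defaultdict(list), defaultdict(list)
    for x, f in zip(population, fitnesses):
        for i, par in enumerate(parents_of):
            pi = tuple(x[j] for j in par)
            by_value[(i, x[i], pi)].append(f)   # fitness given X_i = x_i and parents = pi
            by_parents[(i, pi)].append(f)       # fitness given parents = pi only
    avg = lambda fs: sum(fs) / len(fs)
    return (mean_f,
            {k: avg(v) for k, v in by_value.items()},
            {k: avg(v) for k, v in by_parents.items()})

def inherited_fitness(x, mean_f, by_value, by_parents, parents_of):
    """Estimate fitness as the population mean plus one correction per variable."""
    est = mean_f
    for i, par in enumerate(parents_of):
        pi = tuple(x[j] for j in par)
        key = (i, x[i], pi)
        if key in by_value:
            est += by_value[key] - by_parents[(i, pi)]
    return est
```

With local structures (dBOA, hBOA), the same statistics would be attached to the leaves of the decision graphs rather than to full CPT rows.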

Results: The study finds that fitness inheritance is beneficial in BOA even when fewer than 1% of candidate solutions are evaluated with the actual fitness function. Because the population sizes BOA already needs to build an accurate model of promising solutions are large enough to also yield good fitness estimates, the higher the proportion of inherited fitness values, the greater the savings in actual evaluations.

Conclusion: The study concludes that fitness inheritance is a promising concept in BOA. The results indicate that using BOA's model to estimate the fitness of newly generated candidate solutions can significantly reduce the number of actual fitness evaluations, making this approach a promising way to lower the computational cost of GEAs on real-world problems where fitness evaluation is expensive.

Link to Article: https://arxiv.org/abs/0402032v1
arXiv ID: 0402032v1