Improved Runtime Bounds for the Univariate Marginal Distribution Algorithm via Anti-Concentration

Bibliographic Details
Published in: arXiv.org 2018-02
Main Authors: Lehre, Per Kristian; Nguyen, Phan Trung Hai
Format: Article
Language: English
Description
Summary: Unlike traditional evolutionary algorithms, which produce offspring via genetic operators, Estimation of Distribution Algorithms (EDAs) sample solutions from probabilistic models learned from selected individuals. It is hoped that EDAs may improve optimisation performance on epistatic fitness landscapes by learning variable interactions. However, hardly any rigorous results are available to support claims about the performance of EDAs, even for fitness functions without epistasis. The expected runtime of the Univariate Marginal Distribution Algorithm (UMDA) on OneMax was recently shown to be \(\mathcal{O}\left(n\lambda\log \lambda\right)\) by Dang and Lehre (GECCO 2015). Later, Krejca and Witt (FOGA 2017) proved the lower bound \(\Omega\left(\lambda\sqrt{n}+n\log n\right)\) via an involved drift analysis. We prove an \(\mathcal{O}\left(n\lambda\right)\) bound, given some restrictions on the population size. This implies the tight bound \(\Theta\left(n\log n\right)\) when \(\lambda=\mathcal{O}\left(\log n\right)\), matching the runtime of classical EAs. Our analysis uses the level-based theorem and anti-concentration properties of the Poisson-Binomial distribution. We expect that these generic methods will facilitate further analysis of EDAs.
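To make the algorithm under analysis concrete, the following is a minimal sketch of UMDA maximising OneMax (the number of 1-bits). The parameter values (n, lam, mu) and the marginal borders [1/n, 1 - 1/n] are illustrative assumptions for this sketch, not settings taken from the paper.

```python
import random

def onemax(x):
    """OneMax fitness: number of 1-bits in the bitstring."""
    return sum(x)

def umda(n=20, lam=50, mu=25, max_gens=500, seed=0):
    """Sketch of the Univariate Marginal Distribution Algorithm on OneMax.

    Each generation: sample lam bitstrings from the product distribution
    given by the marginals p, select the mu fittest, and set each marginal
    to the frequency of 1s among the selected individuals, clamped to the
    borders [1/n, 1 - 1/n] to prevent premature fixation.
    """
    rng = random.Random(seed)
    p = [0.5] * n  # marginal probability of a 1 at each bit position
    for gen in range(max_gens):
        pop = [[1 if rng.random() < p[i] else 0 for i in range(n)]
               for _ in range(lam)]
        pop.sort(key=onemax, reverse=True)
        selected = pop[:mu]
        if onemax(selected[0]) == n:  # optimum found
            return gen, selected[0]
        for i in range(n):
            freq = sum(x[i] for x in selected) / mu
            p[i] = min(max(freq, 1.0 / n), 1.0 - 1.0 / n)
    # budget exhausted: return the best individual of the last population
    return max_gens, max(pop, key=onemax)

gens, best = umda()
print(gens, onemax(best))
```

In each generation the sampled fitness values follow a Poisson-Binomial distribution (a sum of independent, non-identical Bernoulli variables with parameters p); the paper's anti-concentration argument concerns exactly this distribution.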
ISSN: 2331-8422
DOI: 10.48550/arxiv.1802.00721