Mutation Rate Matters Even When Optimizing Monotonic Functions

Benjamin Doerr, Thomas Jansen, Dirk Sudholt, Carola Winzen, Christine Zarges

Research output: Contribution to journal › Article › peer-review

62 Citations (SciVal)

Abstract

Extending previous analyses on function classes like linear functions, we analyze how the simple (1+1) evolutionary algorithm optimizes pseudo-Boolean functions that are strictly monotonic. These functions have the property that whenever only 0-bits are changed to 1, the objective value strictly increases. Contrary to what one would expect, not all of these functions are easy to optimize. The choice of the constant c in the mutation probability p(n) = c/n can make a decisive difference.

We show that if c < 1, then the (1+1) EA finds the optimum of every such function in Θ(n log n) iterations. For c = 1, we can still prove an upper bound of O(n^{3/2}). However, for c ≥ 16, we present a strictly monotonic function such that the (1+1) EA with overwhelming probability needs 2^{Ω(n)} iterations to find the optimum. This is the first time that we observe that a constant factor change of the mutation probability changes the runtime by more than a constant factor.
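To make the setting concrete, the sketch below shows a minimal (1+1) EA with standard bit mutation at rate p(n) = c/n, run on OneMax as one example of a strictly monotonic function. This is an illustrative assumption for exposition, not the authors' implementation or the hard monotonic function constructed in the paper; all names and parameters are chosen for this sketch only.

```python
import random


def one_max(x):
    """Example strictly monotonic function: flipping any 0-bit to 1 strictly
    increases the objective value."""
    return sum(x)


def one_plus_one_ea(f, n, c=1.0, max_iters=100_000, seed=0):
    """(1+1) EA sketch: flip each bit independently with probability c/n and
    accept the offspring if it is at least as good (elitist selection).
    Returns the iteration at which the OneMax optimum (all ones) is found,
    or None if max_iters is exceeded. Assumes f is OneMax-like with optimum n."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    fx = f(x)
    p = c / n
    for t in range(1, max_iters + 1):
        y = [bit ^ 1 if rng.random() < p else bit for bit in x]
        fy = f(y)
        if fy >= fx:
            x, fx = y, fy
        if fx == n:  # optimum of OneMax reached (assumption specific to this example)
            return t
    return None


if __name__ == "__main__":
    # With c < 1 the paper proves Θ(n log n) iterations for every strictly
    # monotonic function; OneMax illustrates the algorithm's typical behaviour.
    print(one_plus_one_ea(one_max, n=100, c=0.9))
```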
Original language: English
Pages (from-to): 1-27
Journal: Evolutionary Computation
Volume: 21
Issue number: 1
Early online date: 12 Mar 2012
DOIs
Publication status: Published - 2013
