Optimisation test functions
EvoLP includes some test functions to benchmark your algorithms. Unless otherwise specified, every function is of the form $f(x)$.
Pseudo-Boolean functions
EvoLP.onemax
— Function
The OneMax function returns the sum of the bits of the individual. For an individual of length $n$, the maximum is achieved with $n$ ones.
\[\text{OneMax}(\mathbf{x}) = \sum_{i=1}^n x_i\]
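A quick usage sketch follows. It assumes EvoLP is installed and that individuals may be passed as plain vectors of 0s and 1s; the expected value in the comment follows directly from the sum formula above.

```julia
using EvoLP

ind = [1, 0, 1, 1, 0]   # a binary individual of length 5
onemax(ind)             # sum of the bits: expected 3
```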
EvoLP.leadingones
— Function
The LeadingOnes function returns the number of uninterrupted ones from the start of the chromosome. The maximum is achieved with $n$ ones, but the landscape is a bit more difficult to traverse.
\[\text{LO}(\mathbf{x}) = \sum_{i=1}^n \prod_{j=1}^i x_j\]
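A hedged sketch under the same vector-of-bits assumption as above; the difference to OneMax is that only the unbroken prefix of ones counts.

```julia
using EvoLP

ind = [1, 1, 0, 1, 1]   # ones after the first 0 do not count
leadingones(ind)        # expected 2 per the formula above
```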
EvoLP.jumpk
— Function
jumpk(x; k=6)
The JumpK function is a modification of the onemax function, with a valley of size $k$ right before the maximum. The only way for a hill climber to reach the maximum is with a perfect flip of $k$ bits, which is considered extremely difficult.
\[\text{JUMP}_k(x) = \begin{cases} \lVert{x}\rVert_1 & \quad \text{if } \lVert x \rVert_1 \in [0..n-k] \cup \{n\},\\ -\lVert x \rVert_1 & \quad \text{otherwise} \end{cases}\]
where $\lVert x \rVert_1 = \sum_{i=1}^n x_i$ is the number of 1-bits in $x \in \{0, 1\}^n$.
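A short sketch of how the valley behaves, assuming bit-vector individuals; the values in the comments are read off the piecewise definition above.

```julia
using EvoLP

x = ones(Int, 8)             # ‖x‖₁ = n = 8, so we sit on the peak
jumpk(x)                     # expected 8

y = [1, 1, 1, 1, 1, 0, 0, 0] # ‖y‖₁ = 5 lies in the valley when k = 6
jumpk(y)                     # expected -5
jumpk(y; k=2)                # narrower valley: 5 ∈ [0..6], expected 5
```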
Continuous functions
The continuous function implementations were taken from Kochenderfer, M.J. and Wheeler, T.A. (2019). Most of these can be visualised in Surjanovic, S. & Bingham, D. (2013).
EvoLP.ackley
— Function
ackley(x; a=20, b=0.2, c=2π)
Ackley's benchmark function. The global minimum is at the origin, with a value of 0. Parameters are typically $a=20$, $b=0.2$, and $c=2\pi$.
\[f(x) = -a \exp\left(-b\sqrt{\frac{1}{d} \sum_{i=1}^d x_i^2}\right) - \exp\left(\frac{1}{d} \sum_{i=1}^d \cos (cx_i) \right) + a + \exp(1)\]
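A minimal sketch, assuming `x` can be any real vector (the dimension $d$ is taken from its length); the commented values follow from the formula and the stated minimum.

```julia
using EvoLP

ackley(zeros(3))                     # at the origin the exponentials cancel: ≈ 0
ackley([1.0, 1.0, 1.0])              # any other point is strictly larger
ackley(zeros(3); a=20, b=0.2, c=2π)  # the defaults, spelled out
```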
EvoLP.booth
— Function
The Booth function is a 2-dimensional quadratic function with global minimum $x^* = [1, 3]$ and optimal value $f(x^*) = 0$.
\[f(x) = (x_1 + 2x_2 - 7)^2 + (2 x_1 + x_2 - 5)^2\]
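A quick check against the quadratic above; the values in the comments are computed by hand from the formula.

```julia
using EvoLP

booth([1.0, 3.0])   # the global minimum: 0
booth([0.0, 0.0])   # (0 + 0 - 7)^2 + (0 + 0 - 5)^2 = 74
```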
EvoLP.branin
— Function
branin(x; a=1, b=5.1/(4π^2), c=5/π, r=6, s=10, t=1/(8π))
The Branin (a.k.a. Branin-Hoo) function has six optional parameters, and features multiple global minima. Some of them are at $x^* = [-\pi, 12.275]$, $x^* = [\pi, 2.275]$, $x^* = [3\pi, 2.475]$ and $x^* = [5\pi, 12.875]$, with $f(x^*) \approx 0.397887$.
\[f(x) = a(x_2 - bx_1^2 + cx_1 - r)^2 + s(1 - t)\cos(x_1) + s\]
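A brief sketch, assuming `x` is a 2-element vector; the commented value is the optimum stated above.

```julia
using EvoLP

branin([-π, 12.275])   # one of the global minima: ≈ 0.397887
branin([π, 2.275])     # another global minimum with the same value
```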
EvoLP.circle
— Function
circle(x)
This test function will be deprecated in a future major release.
The circle function is a multiobjective test function, given by
\[f(x) = \begin{bmatrix} 1 - r\cos(\theta) \\ 1 - r\sin(\theta) \end{bmatrix}\]
where $\theta=x_1$ and $r$ is obtained by
\[r = \frac{1}{2} + \frac{1}{2} \left(\frac{2x_2}{1+x_2^2}\right)\]
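Since this is a multiobjective function, it evaluates to two objective values rather than a scalar. A hedged sketch (assuming a 2-element input; the exact container used for the pair is an implementation detail): at $x = [0, 0]$ we get $\theta = 0$ and $r = 1/2$, so the two objectives are $0.5$ and $1.0$.

```julia
using EvoLP

circle([0.0, 0.0])   # θ = 0, r = 1/2 → objectives 1 - r·cos(0) = 0.5 and 1 - r·sin(0) = 1.0
```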
EvoLP.flower
— Function
flower(x; a=1, b=1, c=4)
This function will be deprecated in a future major release.
The flower function is a two-dimensional test function with flower-like contour lines coming out from the origin. Typically, the optional parameters are set at $a=1$, $b=1$ and $c=4$. The function is minimised near the origin, although it does not have a global minimum due to atan being undefined at $[0, 0]$.
\[f(x) = a\lVert\mathbf{x}\rVert + b \sin(c\arctan(x_2, x_1))\]
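A small sketch near the origin (2-element input assumed; the comments follow from the formula with the default parameters).

```julia
using EvoLP

flower([0.1, 0.1])   # sin(4·atan(0.1, 0.1)) = sin(π) ≈ 0, so f ≈ ‖x‖ ≈ 0.141
flower([2.0, 2.0])   # farther out, the norm term a‖x‖ dominates
```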
EvoLP.michalewicz
— Function
michalewicz(x; m=10)
The Michalewicz function is a $d$-dimensional function with several steep valleys, where m controls the steepness. m is usually set at 10. For 2 dimensions, $x^* = [2.20, 1.57]$, with $f(x^*) = -1.8011$.
\[f(x) = -\sum_{i=1}^{d}\sin(x_i) \sin^{2m}\left(\frac{ix_i^2}{\pi}\right)\]
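A minimal sketch in two dimensions; the commented value is the 2-d optimum quoted above.

```julia
using EvoLP

michalewicz([2.20, 1.57])         # 2-d minimum: ≈ -1.8011
michalewicz([2.20, 1.57]; m=10)   # m = 10 is the default steepness
```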
EvoLP.rosenbrock
— Function
rosenbrock(x; b=100)
This is the $d$-dimensional Rosenbrock function. In previous releases, it was 2-dimensional only and had an additional keyword argument a. Update your workflow accordingly.
The $d$-dimensional Rosenbrock banana benchmark function. With $b=100$, the minimum is at $f([1, \dots, 1]) = 0$.
\[f(x) = \sum_{i=1}^{d-1} \left[b(x_{i+1} - x_i^2)^2 + (x_i - 1)^2 \right]\]
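A short sketch of the $d$-dimensional form; the values in the comments follow from the sum above.

```julia
using EvoLP

rosenbrock(ones(2))           # 2-d minimum at [1, 1]: 0
rosenbrock(ones(5))           # [1, …, 1] minimises it in any dimension: 0
rosenbrock(zeros(2); b=100)   # b·(0 - 0)^2 + (0 - 1)^2 = 1
```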
EvoLP.wheeler
— Function
wheeler(x, a=1.5)
Wheeler's ridge is a 2-dimensional function with a single global minimum in a curved peak. With the default $a = 1.5$, the minimum is at $x^* = [1, 1.5]$, with $f(x^*) = -1$.
\[f(x) = - \exp(- (x_1 x_2 - a)^2 - (x_2 - a)^2 )\]
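A final sketch; the comments are computed from the exponential above with $a = 1.5$.

```julia
using EvoLP

wheeler([1.0, 1.5])   # x₁x₂ = a and x₂ = a, so f = -exp(0) = -1
wheeler([0.0, 0.0])   # f = -exp(-(2.25 + 2.25)) = -exp(-4.5) ≈ -0.011
```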