DE is an evolutionary algorithm based on vector differences. See [2] for more details.
Metaheuristics.DE — Type
DE(;
N = 0,
F = 1.0,
CR = 0.5,
...
)
Example output:
minimizer: [3.2777877981303293e-13, 3.7650459509488005e-13, -7.871487597385812e-13]
f calls: 30000
total time: 0.0319 s
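The example above is truncated to its output; below is a minimal sketch of the call pattern used throughout this page, with an illustrative sphere objective and 3-D bounds (the minimum and minimizer accessors are assumptions about the package API, not shown in this extract):
using Metaheuristics

f(x) = sum(x .^ 2)              # illustrative objective (sphere function)
bounds = [-1 -1 -1; 1 1 1.0]    # 2×3 matrix: first row lower bounds, second row upper bounds
result = optimize(f, bounds, DE(N = 50, F = 1.0, CR = 0.5))
display(result)                 # prints minimizer, f calls, total time, ...
minimum(result), minimizer(result)   # assumed result accessors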
PSO is a population-based optimization technique inspired by the motion of bird flocks and schooling fish by [3].
Metaheuristics.PSO — Type
PSO(;
N = 0,
C1 = 2.0,
C2 = 2.0,
...
)
Example output:
minimizer: [-3.055334698085433e-20, -8.666986835846171e-21, -3.8118413472544027e-20]
f calls: 30000
total time: 0.1365 s
A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm by [4].
Metaheuristics.ABC — Type
ABC(;
N = 50,
Ne = div(N+1, 2),
No = div(N+1, 2),
...
)
Example output:
minimizer: [8.257485723496422e-5, 0.0002852795196258074, -3.5620824723352315e-5]
f calls: 30039
total time: 0.0432 s
Multiobjective optimization problems with complicated Pareto sets by [5].
Metaheuristics.MOEAD_DE — Type
MOEAD_DE(weights;
F = 0.5,
CR = 1.0,
λ = Array{Vector{Float64}}[], # ref. points
...
)
Example:
...
status_moead = optimize(f, bounds, moead_de)
# show results
display(status_moead)
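The example above is truncated; a sketch of how a run might be set up, assuming the gen_ref_dirs helper for generating weight vectors and the (objectives, inequality constraints, equality constraints) tuple convention for vector-valued objectives (both are assumptions about the package API, not shown in this extract):
using Metaheuristics

# Illustrative bi-objective problem returning (fx, gx, hx).
f(x) = begin
    v = 1.0 + sum(x[2:end] .^ 2)
    fx = [v * x[1], v * (1.0 - x[1])]
    fx, [0.0], [0.0]            # no inequality or equality constraints
end

bounds = [0 0 0; 1 1 1.0]
weights = gen_ref_dirs(2, 100)  # assumed helper: evenly spread weight vectors for 2 objectives
moead_de = MOEAD_DE(weights, F = 0.5, CR = 1.0)
status_moead = optimize(f, bounds, moead_de)
# show results
display(status_moead)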
Chaotic gravitational constants for the gravitational search algorithm by [6].
Metaheuristics.CGSA — Type
CGSA(;
N::Int = 30,
chValueInitial::Real = 20,
chaosIndex::Real = 9,
...
)
Example output:
minimizer: [-8.8507563788141e-6, -1.3050111801923072e-5, 2.7688577445980026e-5]
f calls: 40000
total time: 1.0323 s
Physics-inspired algorithm for optimization by [7].
Metaheuristics.SA — Type
SA(;
x_initial::Vector = zeros(0),
N::Int = 500,
tol_fun::Real = 1e-4,
...
)
Example output:
minimizer: [4.4638292404181215e-35, -1.738939846089388e-36, -9.542441152683457e-37]
f calls: 29802
total time: 0.0965 s
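A short sketch of warm-starting SA from a user-supplied point via x_initial (the values are illustrative, with f and the bounds as in the DE sketch above):
sa = SA(x_initial = [0.3, -0.2, 0.1], N = 500, tol_fun = 1e-4)
status = optimize(f, [-1 -1 -1; 1 1 1.0], sa)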
The Whale Optimization Algorithm inspired by humpback whales proposed in [8].
Metaheuristics.WOA — Type
WOA(; N = 30, information = Information(), options = Options())
Parameters for the Whale Optimization Algorithm. N is the population size (number of whales).
Example
julia> f(x) = sum(x.^2)
f (generic function with 1 method)
julia> optimize(f, [-1 -1 -1; 1 1 1.0], WOA())
...
f calls: 50000
total time: 0.0588 s
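The WOA signature above shows the information and options keywords accepted by the constructors on this page; a hedged sketch of how they might be used (the keyword names f_optimum, f_calls_limit, and seed are assumptions about the package API, not shown in this extract):
information = Information(f_optimum = 0.0)            # assumed: known optimum value for early stopping
options = Options(f_calls_limit = 10_000, seed = 1)   # assumed: evaluation budget and reproducibility
result = optimize(f, [-1 -1 -1; 1 1 1.0], WOA(N = 30, information = information, options = options))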
A fast and elitist multiobjective genetic algorithm: NSGA-II by [9].
Metaheuristics.NSGA2 — Type
NSGA2(;
N = 100,
η_cr = 20,
p_cr = 0.9,
...
)
Example:
...
status = optimize(f, bounds, nsga2)
# show results
display(status)
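The NSGA2 example above is truncated; a sketch using the MTP test problem that also appears in the CCMO example below (pareto_front is assumed to be the accessor for the obtained non-dominated set):
f, bounds, pf = Metaheuristics.TestProblems.MTP()   # multi-objective test problem, as in the CCMO example
nsga2 = NSGA2(N = 100, η_cr = 20, p_cr = 0.9)
status = optimize(f, bounds, nsga2)
# show results
display(status)
front = pareto_front(status)                        # assumed accessor for the obtained front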
An Evolutionary Many-Objective Optimization Algorithm Using Reference-Point-Based Nondominated Sorting Approach, Part I: Solving Problems With Box Constraints by [10].
Metaheuristics.NSGA3 — Type
NSGA3(;
N = 100,
η_cr = 20,
p_cr = 0.9,
...
)
Example:
...
status = optimize(f, bounds, nsga3)
# show results
display(status)
An EMO algorithm using the hypervolume measure as a selection criterion by [11].
Metaheuristics.SMS_EMOA — Type
SMS_EMOA(;
N = 100,
η_cr = 20,
p_cr = 0.9,
...
)
Example:
...
status = optimize(f, bounds, sms_emoa)
# show results
display(status)
Improved strength Pareto evolutionary algorithm by [12].
Metaheuristics.SPEA2 — Type
SPEA2(;
N = 100,
η_cr = 20,
p_cr = 0.9,
...
)
Example:
...
status = optimize(f, bounds, nsga2)
# show results
display(status)
Bilevel Centers Algorithm has been proposed to solve bilevel optimization problems. See BilevelHeuristics.BCA
for details.
Machine-coded Compact Genetic Algorithms for real-valued optimization problems by [13].
Metaheuristics.MCCGA — Type
MCCGA(; N, maxsamples)
Parameters:
N: population size. Default is 100.
maxsamples: maximum number of samples. Default is 10000.
Description
MCCGA implements machine-coded genetic algorithms for real-valued optimization problems. The algorithm is based on the compact genetic algorithm (CGA), but uses a machine-coded representation following the IEEE-754 floating-point encoding standard. In the first stage, maxsamples samples are drawn from the function domain to obtain a vector of probabilities for each single bit of the IEEE-754 representation. In classical CGAs the initial vector of probabilities is the constant 0.5, whereas in MCCGA the probability that the i-th bit takes the value 1 depends on the function domain. The second stage performs a CGA search, again over the IEEE-754 bits. Since a CGA maintains a single vector of probabilities rather than a population of solutions, the parameter N does not really mean the number of solutions; instead it controls the amount of mutation in each iteration, e.g. 1/N. In the last stage, a local search is performed for fine-tuning; this implementation uses the Hooke & Jeeves direct search algorithm.
Example
julia> f, bounds, solutions = Metaheuristics.TestProblems.rastrigin();
julia> result = optimize(f, bounds, MCCGA())
...
f calls: 6012
total time: 1.5233 s
stop reason: Other stopping criteria.
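For intuition about the machine-coded representation described above, Julia's built-in bitstring exposes the IEEE-754 bit pattern that such a per-bit probability vector ranges over (purely illustrative; the bit width MCCGA actually uses is not stated in this extract):
julia> length(bitstring(1.5f0))   # Float32: 1 sign + 8 exponent + 23 fraction bits
32

julia> length(bitstring(1.5))     # Float64: 1 sign + 11 exponent + 52 fraction bits
64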
Metaheuristics.GA — Type
GA(;
N = 100,
p_mutation = 1e-5,
p_crossover = 0.5,
...
)
Example output:
f calls: 49900
total time: 0.5775 s
stop reason: Maximum number of iterations exceeded.
A Coevolutionary Framework for Constrained Multiobjective Optimization Problems proposed by [14].
Metaheuristics.CCMO — Type
CCMO(base_optimizer; information, options)
Parameters for the CCMO algorithm. base_algorithm only supports NSGA2().
A feasible solution is such that g_i(x) ≤ 0 and h_j(x) = 0.
Example
julia> f, bounds, pf = Metaheuristics.TestProblems.MTP();
julia> ccmo = CCMO(NSGA2(N=100, p_m=0.001));
...
feasibles: 100 / 100 in final population
total time: 7.0616 s
stop reason: Maximum number of iterations exceeded.
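The step between constructing ccmo and the results shown above was elided; presumably it is the usual call (a hedged completion, not copied from the page):
julia> status = optimize(f, bounds, ccmo);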
ε Constrained Differential Evolution with Gradient-Based Mutation and Feasible Elites by [15].
Gradient mutation is not implemented here.
Metaheuristics.εDE — Type
εDE(cp = 5, DE_kargs...)
epsilonDE(cp = 5, DE_kargs...)
Parameters for ε Differential Evolution for constrained optimization.
See DE for more details about the DE parameters (DE_kargs).
This implementation does not include the gradient-based repair method.
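A sketch of a constrained single-objective problem in the (fx, gx, hx) return convention, where feasibility means g_i(x) ≤ 0 and h_j(x) = 0 as noted in the CCMO entry above (the return convention and the extra DE keywords passed through DE_kargs are assumptions about the package API):
# Minimize the sphere subject to x[1] + x[2] ≥ 1, written as g(x) = 1 - x[1] - x[2] ≤ 0.
f(x) = begin
    fx = sum(x .^ 2)
    gx = [1.0 - x[1] - x[2]]    # inequality constraints, feasible when ≤ 0
    hx = [0.0]                  # no equality constraints
    fx, gx, hx
end

bounds = [-1 -1; 1 1.0]
status = optimize(f, bounds, εDE(cp = 5, F = 0.7, CR = 0.5))
display(status)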
Biased Random Key Genetic Algorithm by [16].
Metaheuristics.BRKGA — Function
BRKGA(num_elites = 20, num_mutants = 10, num_offsprings = 70, bias = 0.7)
Biased Random Key Genetic Algorithm (BRKGA).
Example
julia> target_perm = collect(reverse(1:10))
10-element Vector{Int64}:
 10
  9
  8
  7
  6
  5
  4
  3
  2
  1
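The remainder of the example was elided; a hedged sketch of how the random-key decoding and objective might look for this permutation-recovery task (decode, the objective, and the minimizer accessor below are illustrative assumptions, not the page's exact code):
julia> decode(keys) = sortperm(keys);              # random keys in [0, 1] decode to a permutation

julia> f(keys) = sum(abs.(decode(keys) .- target_perm));   # distance to the target permutation

julia> res = optimize(f, [zeros(10) ones(10)]', BRKGA(num_elites = 20, num_mutants = 10, num_offsprings = 70));

julia> decode(minimizer(res))                      # ideally recovers [10, 9, ..., 1]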
This document was generated with Documenter.jl version 0.27.25 on Monday 4 March 2024. Using Julia version 1.7.3.