# Docs for MOO #823
```julia
prob = Optimization.OptimizationProblem(f, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, BBO_adaptive_de_rand_1_bin_radiuslimited(), maxiters = 100000,
    maxtime = 1000.0)
```
## Multi-objective optimization

The optimizer for multi-objective optimization is `BBO_borg_moea()`. Your objective function should return a vector of the objective values, and you should indicate the fitness scheme (typically Pareto fitness) and specify the number of objectives. Otherwise, usage is similar. Here is an example:

```@example MOO-BBO
using OptimizationBBO, Optimization, BlackBoxOptim
using SciMLBase: MultiObjectiveOptimizationFunction

u0 = [0.25, 0.25]
opt = OptimizationBBO.BBO_borg_moea()

function multi_obj_func(x, p)
    f1 = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2  # Rosenbrock function
    f2 = -20.0 * exp(-0.2 * sqrt(0.5 * (x[1]^2 + x[2]^2))) -
         exp(0.5 * (cos(2π * x[1]) + cos(2π * x[2]))) + exp(1) + 20.0  # Ackley function
    return [f1, f2]
end

mof = MultiObjectiveOptimizationFunction(multi_obj_func)
prob = Optimization.OptimizationProblem(mof, u0; lb = [0.0, 0.0], ub = [2.0, 2.0])
sol = solve(prob, opt, NumDimensions = 2,
    FitnessScheme = ParetoFitnessScheme{2}(is_minimizing = true))
```
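Pareto fitness, as used above, compares candidates by dominance rather than by a single scalar: one objective vector dominates another when it is no worse in every objective and strictly better in at least one. As a language-neutral sketch of that rule (plain Python, not part of the Optimization.jl API):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# (1, 2) dominates (2, 3); (1, 3) and (3, 1) are mutually non-dominated.
print(dominates((1, 2), (2, 3)))  # True
print(dominates((1, 3), (3, 1)))  # False
```

A multi-objective solver returns a set of such mutually non-dominated points (a Pareto front) rather than a single minimizer.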
```julia
f = OptimizationFunction(rosenbrock)
prob = Optimization.OptimizationProblem(f, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, Evolutionary.CMAES(μ = 40, λ = 100))
```
## Multi-objective optimization

The Rosenbrock and Ackley functions can be optimized using `Evolutionary.NSGA2()` as follows:

```@example MOO-Evolutionary
using Optimization, OptimizationEvolutionary, Evolutionary

function func(x, p = nothing)::Vector{Float64}
    f1 = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2  # Rosenbrock function
    f2 = -20.0 * exp(-0.2 * sqrt(0.5 * (x[1]^2 + x[2]^2))) -
         exp(0.5 * (cos(2π * x[1]) + cos(2π * x[2]))) + exp(1) + 20.0  # Ackley function
    return [f1, f2]
end

initial_guess = [1.0, 1.0]
obj_func = MultiObjectiveOptimizationFunction(func)
algorithm = OptimizationEvolutionary.NSGA2()
problem = OptimizationProblem(obj_func, initial_guess)
result = solve(problem, algorithm)
```
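NSGA-II's selection step is built on non-dominated sorting: the first "front" is the subset of candidates that no other candidate dominates. A minimal standalone sketch of extracting that first front (plain Python, illustrative only, not the package's implementation):

```python
def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors (minimization)."""
    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
    return [p for p in points if not any(dominates(q, p) for q in points)]

pts = [(1.0, 4.0), (2.0, 2.0), (3.0, 3.0), (4.0, 1.0)]
print(pareto_front(pts))  # [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0)]
```

Here `(3.0, 3.0)` is dropped because `(2.0, 2.0)` dominates it; the remaining points trade off the two objectives against each other.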
```julia
sol = solve(prob, ECA(), use_initial = true, maxiters = 100000, maxtime = 1000.0)
```

### With Constraint Equations

While `Metaheuristics.jl` supports such constraints, `Optimization.jl` currently does not relay these constraints.
## Multi-objective optimization

The ZDT1 test function can be optimized using `Metaheuristics.jl` as follows:

```@example MOO-Metaheuristics
using Optimization, OptimizationMetaheuristics, Metaheuristics
using Statistics: mean

function zdt1(x)
    f1 = x[1]
    g = 1 + 9 * mean(x[2:end])
    h = 1 - sqrt(f1 / g)
    f2 = g * h
    # In this example, we have no constraints
    gx = [0.0]  # Inequality constraints (not used)
    hx = [0.0]  # Equality constraints (not used)
    return [f1, f2], gx, hx
end

multi_obj_fun = MultiObjectiveOptimizationFunction((x, p) -> zdt1(x))

# Define the problem bounds
lower_bounds = [0.0, 0.0, 0.0]
upper_bounds = [1.0, 1.0, 1.0]

# Define the initial guess
initial_guess = [0.5, 0.5, 0.5]

# Create the optimization problem
prob = OptimizationProblem(multi_obj_fun, initial_guess; lb = lower_bounds,
    ub = upper_bounds)

nobjectives = 2
npartitions = 100

# Reference points (Das and Dennis's method)
weights = Metaheuristics.gen_ref_dirs(nobjectives, npartitions)

# Choose the algorithm as required.
alg1 = Metaheuristics.NSGA2()
alg2 = Metaheuristics.NSGA3()
alg3 = Metaheuristics.SPEA2()
alg4 = Metaheuristics.CCMO(NSGA2(N = 100, p_m = 0.001))
alg5 = Metaheuristics.MOEAD_DE(weights, options = Options(debug = false, iterations = 250))
alg6 = Metaheuristics.SMS_EMOA()

# Solve the problem
sol1 = solve(prob, alg1; maxiters = 100, use_initial = true)
sol2 = solve(prob, alg2; maxiters = 100, use_initial = true)
sol3 = solve(prob, alg3; maxiters = 100, use_initial = true)
sol4 = solve(prob, alg4)
sol5 = solve(prob, alg5; maxiters = 100, use_initial = true)
sol6 = solve(prob, alg6; maxiters = 100, use_initial = true)
```
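ZDT1 is a standard benchmark with a known analytic Pareto front: when all decision variables after the first are zero, `g = 1` and the objectives satisfy `f2 = 1 - sqrt(f1)`, so solver output can be checked against this curve. A quick standalone check of the definition used above (plain Python mirroring the Julia `zdt1`, constraint outputs omitted):

```python
import math

def zdt1(x):
    """ZDT1 objectives, matching the Julia example above (no constraints)."""
    f1 = x[0]
    g = 1 + 9 * sum(x[1:]) / len(x[1:])
    h = 1 - math.sqrt(f1 / g)
    return [f1, g * h]

# On the Pareto-optimal set (all trailing variables zero), f2 == 1 - sqrt(f1).
f1, f2 = zdt1([0.25, 0.0, 0.0])
print(f1, f2)  # 0.25 0.5
assert abs(f2 - (1 - math.sqrt(f1))) < 1e-12
```

Points returned by the solvers above that lie near this curve (with the trailing variables near zero) are close to Pareto-optimal.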