5. Optimization#
5.1. What is Optimization#
Optimization refers to the process of finding the best solution from a feasible space, typically by minimizing or maximizing an objective function.
In modeling and simulation, optimization is used to identify input parameters that yield the most desirable output — highest fidelity, lowest cost, or best performance.
An optimization problem should include:
One or more objective functions to minimize or maximize
Decision variables with type, range, and attributes
Constraints (optional)
Optimization can be single-objective or multi-objective, depending on the number of objective functions; a minimal illustration follows.
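The sketch below is a minimal, library-agnostic illustration of these ingredients for a single-objective problem: a Sphere objective to minimize, three bounded decision variables, and one optional inequality constraint. It is purely illustrative and does not use the UQPyL API (that is introduced in the following sections).

import numpy as np

# Decision variables: x in R^3, each bounded to [lb, ub]
lb, ub = -5.0, 5.0

def objective(x):
    # Objective to minimize: the Sphere function, f(x) = sum(x_i^2)
    return float(np.sum(x ** 2))

def constraint(x):
    # Optional constraint, written in the form g(x) <= 0
    return float(np.sum(x) - 1.0)

x = np.random.uniform(lb, ub, size=3)       # one candidate solution
print(objective(x), constraint(x) <= 0.0)   # objective value and feasibility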
5.2. Overview of UQPyL.optimization#
The optimization module in UQPyL provides widely used optimization
algorithms, organized into two submodules:
single_objective and multi_objective
The currently implemented algorithms are listed below (the module is actively updated).
5.2.1. Algorithms for single-objective optimization#
| Abbreviation | Full Name | Optimization Label | References |
|---|---|---|---|
| SCE-UA | Shuffled Complex Evolution | Single | |
| ML-SCE-UA | M&L Shuffled Complex Evolution | Single | |
| GA | Genetic Algorithm | Single | |
| CSA | Cooperation Search Algorithm | Single | |
| PSO | Particle Swarm Optimization | Single | |
| DE | Differential Evolution | Single | |
| ABC | Artificial Bee Colony | Single | |
| ASMO | Adaptive Surrogate Modelling based Optimization | Single, Surrogate | |
| EGO | Efficient Global Optimization | Single, Surrogate | |
5.2.2. Algorithms for multi-objective optimization#
| Abbreviation | Full Name | Optimization Label | References |
|---|---|---|---|
| MOEA/D | Multi-objective Evolutionary Algorithm based on Decomposition | Multiple | |
| NSGA-II | Nondominated Sorting Genetic Algorithm II | Multiple | |
| NSGA-III | Nondominated Sorting Genetic Algorithm III | Multiple | |
| RVEA | Reference Vector Guided Evolutionary Algorithm | Multiple | |
| MO-ASMO | Multi-Objective Adaptive Surrogate Modeling Optimization | Multiple, Surrogate | |
Note
The label Surrogate indicates algorithms designed for computationally expensive optimization. See the Surrogate Model section for details.
All algorithms above inherit from algorithmABC which includes a fixed
run method. Optimization history and results are stored in the Result class.
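Because every algorithm shares this interface, switching optimizers usually means changing only the constructor call. A minimal sketch of the common pattern, assuming DE accepts the fixed parameters listed in 5.3.1.1.2 (its algorithm-specific hyper-parameters are left at their defaults):

from UQPyL.problems.single_objective import Sphere
from UQPyL.optimization.single_objective import DE

problem = Sphere(nInput=10, ub=100, lb=-100)  # 10-dimensional Sphere test problem
de = DE(maxFEs=20000, verboseFlag=False)      # fixed parameters inherited from algorithmABC
res = de.run(problem=problem)                 # run() is the fixed method defined by algorithmABC
print(res.bestObjs)                           # the Result stores the history and best solution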
5.3. API Reference#
5.3.1. Class algorithmABC#
The algorithmABC class defines the base structure of all optimization
algorithms in UQPyL and contains the fixed function run.
5.3.1.1. Constructor#
__init__(...)
5.3.1.1.1. Description#
Initializes an optimization algorithm instance.
Its hyper-parameters fall into two groups: fixed parameters shared by all algorithms, and algorithm-specific hyper-parameters.
5.3.1.1.2. Fixed Parameters#
maxFEs (int) Maximum number of function evaluations.
maxIterTimes (int) Maximum number of iterations.
maxTolerateTimes (int) Number of times an improvement below the tolerate threshold can be tolerated.
tolerate (float) Improvement threshold used when counting tolerated times.
verboseFlag (bool) Enable detailed logs. Default True.
saveFlag (bool) Save results to ./Result/Data. Default False.
logFlag (bool) Save verbose logs to ./Result/Log. Default False.
5.3.1.1.3. Other hyper-parameters#
These vary depending on the algorithm.
Example: GA has proC, proM, disC, and disM for controlling its variation operators (see the sketch below).
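A minimal sketch of passing these alongside the fixed parameters when constructing GA; the values shown are illustrative assumptions, not documented defaults:

from UQPyL.optimization.single_objective import GA

# Fixed parameters shared by all algorithms, plus GA-specific operator controls.
ga = GA(
    nPop=100,       # population size
    maxFEs=20000,   # fixed parameter: budget of function evaluations
    proC=1.0,       # GA-specific variation-operator settings;
    proM=1.0,       # the values here are illustrative assumptions,
    disC=20,        # not documented defaults
    disM=20,
)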
5.3.2. Methods#
run(problem)
Execute the optimization.
Parameters:
problem (Problem) — A Problem instance describing the objectives, decision variables, and constraints.
Returns:
Result — an object containing the optimization history and results.
5.4. Class Result#
Used to store optimization history and results; a sketch showing how to inspect these attributes follows the list below.
5.4.1. Attributes#
bestDecs : Best decision variables
bestTrueDecs : Best decoded decisions (int/discrete processed)
bestObjs : Best objective values
bestTrueObjs : Best true objectives (if optType='max')
bestCons : Best constraint values
bestMetric : Best performance metric for multi-objective
bestFeasible : Whether best solution satisfies constraints
appearFEs : FE count of best solution
appearIters : Iteration of best solution
historyBestDecs : Best decisions at different FE levels
historyBestTrueDecs : Decoded decisions across FE history
historyBestObjs : Best objectives at FE levels
historyBestTrueObjs : Best true objectives history
historyBestCons : Constraint values history
historyBestMetrics : Multi-objective metric history
historyDecs : All evaluated decision variables
historyObjs : Objective values history
historyCons : Constraint values history
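As a usage hint, the small helper below prints the most commonly used fields of a Result. It is a sketch that only touches attributes named above; the exact container types of the history attributes are not documented here, so they are printed rather than indexed.

def summarize(res):
    # res is the Result object returned by an algorithm's run() method,
    # e.g. res = ga.run(problem=problem)
    print("best decisions :", res.bestDecs)
    print("best objectives:", res.bestObjs)
    print("found at FE    :", res.appearFEs, "iteration", res.appearIters)
    print("feasible       :", res.bestFeasible)
    # Convergence history: best objectives recorded at different FE levels
    print("history of best objectives:", res.historyBestObjs)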
5.5. How to run optimization#
5.5.1. Single-objective optimization#
from UQPyL.problems.single_objective import Sphere
from UQPyL.optimization.single_objective import GA
# Sphere function in 15D
problem = Sphere(nInput=15, ub=100, lb=-100)
ga = GA(
nPop=50,
maxFEs=50000,
verboseFlag=True,
saveFlag=True,
logFlag=True
)
res = ga.run(problem=problem)
print(res.bestDecs) # or res.bestTrueDecs
print(res.bestObjs) # or res.bestTrueObjs
5.5.2. Multi-objective optimization#
from UQPyL.problems.multi_objective import ZDT1
from UQPyL.optimization.multi_objective import NSGAII
problem = ZDT1()
nsgaii = NSGAII(
nPop=50,
maxFEs=20000,
verboseFlag=True,
saveFlag=True,
logFlag=True
)
res = nsgaii.run(problem=problem)
print(res.bestDecs)
print(res.bestObjs) # shape (N, 2)
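For ZDT1 the result is a set of trade-off solutions, so a quick sanity check is to scatter-plot the two objective columns of res.bestObjs. A sketch continuing from the run above, assuming matplotlib is available:

import matplotlib.pyplot as plt

objs = res.bestObjs                 # (N, 2) array of objective vectors, as noted above
plt.scatter(objs[:, 0], objs[:, 1], s=10)
plt.xlabel("f1")
plt.ylabel("f2")
plt.title("Approximated Pareto front for ZDT1")
plt.show()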
5.6. Checking optimization results (.hdf)#
When saveFlag or logFlag is set to True, outputs are automatically stored in:
./Result/Data (results, when saveFlag=True)
./Result/Log (verbose logs, when logFlag=True)
The output file naming format is:
A_B_C_D_I.hdf
Where:
A – Algorithm name
B – Problem name
C – Problem dimension (written as D<dimension>)
D – Number of objectives (written as M<number of objectives>)
I – Run index
Example:
GA_Sphere_D15_M1_1.hdf
means:
Algorithm = GA
Problem = Sphere
Dimension = 15
Objective count = 1
Run index = 1
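To inspect a saved file directly, any HDF5 reader such as h5py can list its contents. The sketch below assumes h5py is installed; the internal group/dataset layout of UQPyL's .hdf files is not documented here, so the snippet only walks the file and prints whatever it contains.

import h5py

# Hypothetical path following the naming scheme above
path = "Result/Data/GA_Sphere_D15_M1_1.hdf"

with h5py.File(path, "r") as f:
    def show(name, obj):
        # Print each dataset with its shape and dtype, and each group as a folder
        if isinstance(obj, h5py.Dataset):
            print(name, obj.shape, obj.dtype)
        else:
            print(name + "/")
    f.visititems(show)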