---
myst:
  substitutions:
    macro:
      python: 1-5,13-73,78-
      cpp: 1-85,92-93
    sentence1:
      python: ""
      cpp: "the user function is artificially slowed down and"
    sentence2:
      python: "5 parameters and 8"
      cpp: "3 parameters and 4"
    sentence3:
      python: "constant values"
      cpp: "cpp macro values"
    sentence4:
      python: "`cpus`"
      cpp: "`NR`"
    sentence5:
      python: "`krigingStep` and `optimStop`"
      cpp: "`NP`, `NK` and `NC`"
    sentence6:
      python: ""
      cpp: "problem parameters, the number of"
    sentence7:
      python: "`ego_test`"
      cpp: "`metamodoptEgoHimmel`"
    sentence8:
      python: "`TPythonEval`"
      cpp: "`TCjitEval`"
    sentence9:
      python: |
        The {{tds}} is empty, and a sampler is run implicitly to be able to create a first {{surmod}}. This step may be unnecessary when the {{tds}} is filled by the `fileDataRead` method.
      cpp: |
        This step may be unnecessary here, since the {{tds}} is filled with `fileDataRead`; otherwise, a sampler is run implicitly.
---

# Macro "**metamodoptEgoHimmel.{{extension}}**"

## Objective

The objective of this macro is to optimize a function using the **Efficient Global Optimization** (EGO) algorithm. EGO is well suited to problems where solution evaluations are expensive (efficient) and where the problem has local minima (global). In our example, {{sentence1[language]}} evaluations are parallelized with threads to illustrate the standard context of its use.

The user function is inspired by an academic function, the Himmelblau function. It is a multimodal function, used here to demonstrate the global search that is performed. The problem has {{sentence2[language]}} optimal solutions.

## Macro {{uranie}}

This macro follows the structure of a reoptimizer macro (see previous chapter), so we only take a quick look at the generic parts. Only the specific code is explained in more detail.

{{ "```{literalinclude} " + parent_dir + "/roottest/uranie/doc/metaModelOptim/use_cases/" + language + "/metamodoptEgoHimmel." + extension + "\n" + ":language: " + language + "\n" + ":lines: " + macro[language] + "\n" + "```" }}

At the beginning, some {{sentence3[language]}} are defined.

```{only} cpp
- The `LENT` value is used to slow down the evaluation time.
```

- The {{sentence4[language]}} value defines the number of threads used for the evaluations.
- The {{sentence5[language]}} values define respectively the number of {{sentence6[language]}} evaluations needed before creating a first {{surmod}}, and the maximum number of evaluations allowed.

The `userfun` function defines the user function, and {{sentence7[language]}} the optimization to perform. The latter follows the usual structure: it defines the problem variables, a {{sentence8[language]}} to describe the user function, and a `TThreadRun` to use thread parallelism.

In the master block, a working {{tds}} and the optimization classes are defined. Notice that the input variables are defined with a `TUniformDistribution` (not just a `TAttribute` with minimum and maximum values). {{sentence9[language]}}

EGO uses two different solvers: one for the {{surmod}} construction and one for the EI optimization:

- for the {{surmod}}, only one class is provided (`TKBModeler`). It allows configuring which kriging model to use (`setModel`) and how it is constructed (`setSolver`);
- for maximizing the EI, a `TEgoHjDynSolver` is defined, meaning that dynamic optimization with the HJMA algorithm is used. Two parameters are given: the first configures the initial search, and the second configures the following searches, which reuse previous results.

These solvers are passed to the `TEGO` class with a dedicated method. Here, EGO runs in verbose mode.

The resulting {{tds}} is filled with all evaluated solutions. For a multimodal problem such as ours, keeping just the best solution is not appropriate: a post-processing step is needed to extract the best solutions.
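
For reference, the classical two-dimensional Himmelblau function that inspires the user function, together with its four known global minima, can be sketched in plain Python. This illustration is independent of the macro itself and does not use the {{uranie}} API:

```python
def himmelblau(x, y):
    """Classical 2-D Himmelblau function: multimodal, with four global minima."""
    return (x**2 + y - 11) ** 2 + (x + y**2 - 7) ** 2

# The four known global minima, all with f(x, y) = 0
minima = [(3.0, 2.0),
          (-2.805118, 3.131312),
          (-3.779310, -3.283186),
          (3.584428, -1.848126)]

for x, y in minima:
    print(f"f({x:+.6f}, {y:+.6f}) = {himmelblau(x, y):.1e}")
```

A purely local optimizer started from a single point would converge to only one of these minima, which is why a global strategy such as EGO is used here.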
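
The criterion maximized between evaluations is the standard Expected Improvement. As a minimal plain-Python sketch (the function name is ours, not a {{uranie}} API), the closed form for a kriging prediction with mean `mu` and standard deviation `sigma`, given the current best value `f_min`, is:

```python
import math

def expected_improvement(mu, sigma, f_min):
    """Standard closed-form EI for a minimization problem.

    mu, sigma: kriging prediction mean and standard deviation at a candidate point
    f_min:     best objective value observed so far
    """
    if sigma <= 0.0:
        return 0.0  # no predictive uncertainty: no expected improvement
    z = (f_min - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal pdf
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))         # standard normal cdf
    return (f_min - mu) * cdf + sigma * pdf

# EI vanishes where the model is certain and no better than f_min...
print(expected_improvement(mu=5.0, sigma=0.0, f_min=1.0))  # → 0.0
# ...and grows with predictive uncertainty, balancing exploration and exploitation
print(expected_improvement(mu=1.0, sigma=0.5, f_min=1.0))
```

The next point to evaluate is the one maximizing this quantity, which is the role played here by the `TEgoHjDynSolver`.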
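
The final post-processing step is not shown in the macro. One simple approach to extract one representative solution per optimum is to keep only the near-optimal evaluated points and merge those closer than a given radius. A minimal sketch, assuming the evaluated solutions have been exported as `(x, y, f)` tuples (the function name and thresholds are illustrative):

```python
def best_per_basin(points, f_threshold, radius):
    """Keep near-optimal points, one representative per basin.

    points:      iterable of (x, y, f) tuples (evaluated solutions)
    f_threshold: keep only points with f <= f_threshold
    radius:      two points closer than this are treated as the same basin
    """
    good = sorted((p for p in points if p[2] <= f_threshold), key=lambda p: p[2])
    kept = []
    for x, y, f in good:
        # keep this point only if it is far from every already-kept one
        if all((x - kx) ** 2 + (y - ky) ** 2 > radius ** 2 for kx, ky, _ in kept):
            kept.append((x, y, f))
    return kept

# Toy example: two clusters of near-optimal points plus one bad point
pts = [(3.00, 2.00, 0.00), (3.01, 2.00, 0.05),
       (-2.81, 3.13, 0.01), (0.0, 0.0, 100.0)]
print(best_per_basin(pts, f_threshold=1.0, radius=0.5))
# → [(3.0, 2.0, 0.0), (-2.81, 3.13, 0.01)]
```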