
XIV.10. Macros MetaModelOptim

XIV.10.1.  Macro "metamodoptEgoHimmel.C"

XIV.10.1.1. Objective

The objective of this macro is to optimize a function using the Efficient Global Optimization algorithm (EGO). EGO is well suited to problems where solution evaluations are expensive (hence "efficient") and where the function has several local minima (hence "global"). In our example, the user function is artificially slowed down and the evaluations are parallelized with threads, in order to illustrate the standard context in which EGO is used.

The user function is inspired by an academic test function, the Himmelblau function. It is multimodal, which highlights the global nature of the search. The problem has 3 parameters and 4 optimal solutions.
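
Written out from the code of userfun shown below (this is a reading of the macro, not a formula quoted from the reference documentation), the objective for NP = 3 is

    f(x_1, x_2, x_3) = (x_1^2 - x_2 - 6)^2 + (x_2^2 - x_1 - 6)^2 + (x_3 - x_2)^2

and its four global minima (f = 0) are the solutions of the system x_1^2 - x_2 = 6, x_2^2 - x_1 = 6, x_3 = x_2.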

XIV.10.1.2. Macro Uranie

This macro follows the structure of a Reoptimizer macro (see the previous chapter), so we only take a quick look at the generic parts. Only the specific code is explained in more detail.


/** user choice **/
#define LENT 123456789 // slow down

#define NK 40 // minimal number of evaluations to initialize a kriging model
#define NR 8 // number of resources (threads) to be used
#define NC 300 // maximum number of evaluations
#define NP 3 // number of parameters


/* user function  */
int userfun(double* in, double *out)
{
    double him, ret, tmp;
    int i;

    // himmelblau
    ret = 0;
#ifdef LENT
    for (int j=0; j<LENT; j++)
#endif
    {
        him = 0;
        tmp = in[0]*in[0] - in[1] - 6;
        him += tmp*tmp;
        for (i=1; i<NP; i++) {
            tmp = (i%2) ? in[i]*in[i] - in[i-1] - 6 : in[i] - in[i-1];
            him += tmp*tmp;
        }
        if (him > ret) ret = him;
    }
    
    out[0] = ret;
    return 1;
}

void metamodoptEgoHimmel()
{
    int i;
    
    // input and output variables
    URANIE::DataServer::TUniformDistribution x1("x1", -8., 8.);
    URANIE::DataServer::TUniformDistribution x2("x2", -8., 8.);
    URANIE::DataServer::TUniformDistribution x3("x3", -8., 8.);
    URANIE::DataServer::TAttribute *inatt[4] = {&x1, &x2, &x3, NULL};
    URANIE::DataServer::TAttribute y("y");

    // user fun
    URANIE::Relauncher::TCJitEval fun(&userfun);
    for (i=0; inatt[i] != NULL; i++) fun.addInput(inatt[i]);
    fun.addOutput(&y);

    // runner
    URANIE::Relauncher::TThreadedRun run(&fun, NR+1);
    run.startSlave();
    if (run.onMaster()) {
        // tds
        URANIE::DataServer::TDataServer tds("tds", "ego tds");
        tds.keepFinalTuple(kFALSE);
        for (i=0; inatt[i] != NULL; i++) tds.addAttribute(inatt[i]);

        tds.fileDataRead("lhs3.dat", kFALSE, kTRUE);

        // meta model to use
        URANIE::MetaModelOptim::TEgoKBModeler egomod;
        egomod.setModel("matern7/2", "const", 1e-8);
        egomod.setSolver("ML", "Bobyqa", 300, 800);

        // ei optimiser
        URANIE::MetaModelOptim::TEgoHjDynSolver hjsolv;
        hjsolv.setSize(128, 32);

        // master
        URANIE::MetaModelOptim::TEGO egosolv(&tds, &run);
        egosolv.setSize(NK, NC);
        egosolv.setModeler(&egomod);
        egosolv.setSolver(&hjsolv);
        egosolv.addObjective(&y);
        
        // run master
        egosolv.solverLoop();
 

        // results
        tds.exportData("egoC.dat");
        run.stopSlave();
    }
}
 

At the beginning, some C preprocessor macro values are defined:

  • The LENT value is used to slow down the evaluations.

  • The NR value defines the number of threads used for the evaluations.

  • The NP, NK and NC values define respectively the number of problem parameters, the number of evaluations needed before building a first surrogate model, and the maximum number of evaluations allowed.

The userfun function defines the user function, and metamodoptEgoHimmel the optimisation to perform. The latter follows the usual structure: it defines the problem variables, a TCJitEval to describe the user function, and a TThreadedRun to use thread parallelism. In the master block, a working TDataServer is defined, along with the optimization classes.
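
As a reminder, the master/slave skeleton around which the macro is organised looks like this (the comments are added for this discussion and only paraphrase the calls; refer to the Relauncher chapter for the exact semantics):

URANIE::Relauncher::TThreadedRun run(&fun, NR+1);  // NR+1 threads are requested
run.startSlave();             // evaluation threads enter their work loop here
if (run.onMaster()) {         // only the master thread executes this block
    // ... TDataServer, modeler, EI solver, TEGO set-up and solverLoop() ...
    run.stopSlave();          // release the evaluation threads at the end
}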

Notice that the input variables are defined with a TUniformDistribution (not just a TAttribute with minimum and maximum values). This may seem unnecessary here, since the TDataServer is filled with fileDataRead, but otherwise a sampler is run implicitly and needs the distributions.
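
If no pre-computed design such as lhs3.dat is available, the initial design can also be built explicitly with the Sampler module. The following sketch is not part of the macro and assumes the usual URANIE::Sampler::TSampling interface; it would replace the fileDataRead call:

// Sketch: fill the TDataServer with an LHS design of NK points
// drawn from the laws attached to x1, x2 and x3 (assumed TSampling interface).
URANIE::Sampler::TSampling sam(&tds, "lhs", NK);
sam.generateSample();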

EGO uses two different solvers: one for the surrogate model construction and one for the EI (Expected Improvement) optimisation:

  • for the surrogate model, only one class is provided (TEgoKBModeler). It allows to configure which kriging model to use (setModel) and how it is constructed (setSolver);

  • for maximizing the EI, a TEgoHjDynSolver is defined, meaning a dynamic optimisation with the HJMA algorithm is used. Two parameters are given: the first one configures the initial search, the second one the following searches, which reuse previous results (see the annotated calls after this list).
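
For reference, here are the same configuration calls as in the macro, annotated with the role of each argument as described above; the reading of the individual arguments is an assumption to be checked against the reference documentation of TEgoKBModeler and TEgoHjDynSolver.

URANIE::MetaModelOptim::TEgoKBModeler egomod;
egomod.setModel("matern7/2", "const", 1e-8); // which kriging model: correlation function, trend, and (assumed) a small regularisation term
egomod.setSolver("ML", "Bobyqa", 300, 800);  // how it is constructed: (assumed) estimation criterion, optimisation algorithm and search sizes

URANIE::MetaModelOptim::TEgoHjDynSolver hjsolv;
hjsolv.setSize(128, 32);                     // initial EI search size, then size of the following searches reusing previous results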

These solvers are passed to the TEGO class with dedicated methods (setModeler and setSolver). Currently, EGO runs in verbose mode.

The resulting TDataServer is filled with all the evaluated solutions. For a multimodal problem like this one, keeping only the best solution is not appropriate: a post-processing step is needed to extract the best solutions around each optimum.
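
As a simple illustration of such a post-processing step (a sketch only, assuming the getTuple() accessor to the underlying ROOT TTree and using an arbitrary threshold of 1.0), the evaluated points clustered around the four optima can be listed, either on the tds object before exporting or on a TDataServer re-filled from egoC.dat:

// Sketch: list every evaluated point whose objective is close to 0
// (the threshold value is chosen arbitrarily for the example).
tds.getTuple()->Scan("x1:x2:x3:y", "y < 1.0");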
