10.2.1. Introduction

EGO [JSW98] performs a global search. Like a genetic algorithm, it needs an adequate number of initially evaluated points to start its search: in our case, enough to construct a sufficiently pertinent model. After this first phase, it builds a surrogate model, then loops over updating the model with new evaluations and using it to search for a new promising solution to evaluate.
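The two phases can be sketched schematically in Python. This is a toy 1-D illustration, not Uranie's implementation: the nearest-neighbour "surrogate" and the lower-confidence-bound selection rule are crude stand-ins for the kriging model and the acquisition criterion used by the real algorithm.

```python
import math
import random

def expensive_f(x):
    # toy objective standing in for a costly simulation
    return (x - 0.3) ** 2

def toy_surrogate(X, Y):
    # crude stand-in for a kriging model: the prediction at x is the value
    # of the nearest evaluated point, and the "uncertainty" is the distance
    # to that point (zero at evaluated points, growing away from them)
    def predict(x):
        d, y = min((abs(x - xi), yi) for xi, yi in zip(X, Y))
        return y, d
    return predict

random.seed(0)

# phase 1: an initial design of experiments, evaluated with the true function
X = [random.random() for _ in range(4)]
Y = [expensive_f(x) for x in X]

# phase 2: loop on (re)building the model and picking the next point
for _ in range(20):
    predict = toy_surrogate(X, Y)
    # selection criterion: lower confidence bound mu - k*sigma, a simple
    # stand-in for expected improvement with the same exploration/exploitation
    # trade-off (good predicted value vs. large uncertainty)
    candidates = [i / 200 for i in range(201)]
    x_new = min(candidates,
                key=lambda x: predict(x)[0] - 2.0 * predict(x)[1])
    X.append(x_new)
    Y.append(expensive_f(x_new))

best_y, best_x = min(zip(Y, X))
print(best_x, best_y)
```

With only 24 evaluations of the "expensive" function, the loop concentrates its points around the minimum at x = 0.3, which is the whole point of the surrogate-based approach.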

For its surrogate model, EGO uses kriging models, which provide, for each estimated point, both a prediction value and an associated variance. EGO defines an objective, the expected improvement, which takes both into account and provides a trade-off between a good predicted value and a large uncertainty.
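For a minimization problem, the expected improvement has a well-known closed form in terms of the kriging mean and standard deviation and the best value found so far; a minimal Python sketch (independent of Uranie's own API) could be:

```python
import math

def normal_pdf(z):
    # standard normal density
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def normal_cdf(z):
    # standard normal cumulative distribution, via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_improvement(mu, sigma, f_min):
    """Closed-form expected improvement for minimization, given the kriging
    prediction (mean mu, standard deviation sigma) and the best evaluated
    value f_min."""
    if sigma <= 0.0:
        # no uncertainty: the improvement is deterministic
        return max(f_min - mu, 0.0)
    z = (f_min - mu) / sigma
    return (f_min - mu) * normal_cdf(z) + sigma * normal_pdf(z)
```

The trade-off is visible in the formula: at fixed mean, a larger sigma increases the expected improvement, so a point with a mediocre predicted value but high uncertainty can still be selected for evaluation.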

Because of its efficiency in terms of the number of evaluations, this kind of algorithm is well suited when evaluations are expensive to compute. The EGO algorithm itself, however, is computationally expensive: both the construction of the surrogate model and the search for the next attractive point are complex optimization problems, and both are performed many times, which slows down the resolution of the problem.

Extensions to constrained and/or multi-objective problems should come later in Uranie.