mathematical optimization - Using StepMonitor/EvaluationMonitor with DifferentialEvolution in NMinimize
My question about tracking the progress of a minimisation/fitting process is two-fold:
1. The first part of my question can be considered a slight duplicate/follow-up of this question. There, the OP asked what exactly is passed to the StepMonitor option when using the "DifferentialEvolution" method. The answer by Oleksandr R. states that only the fittest individual of each generation is passed. That is fine by me, but when I try the following, nothing is passed:
data = BlockRandom[SeedRandom[12345];
Table[{x, Exp[-2.3 x/(11 + .4 x + x^2)] + RandomReal[{-.5, .5}]},
{x, RandomReal[{1, 15}, 20]}]];
nlm = Reap@NonlinearModelFit[data, Exp[a x/(b + c x)], {a, b, c}, x,
Method -> {NMinimize, Method -> {"DifferentialEvolution"}},
StepMonitor :> Sow[{a, b, c}]]
Compare this to the same expression using EvaluationMonitor instead of StepMonitor:
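(For reference, a minimal sketch of that comparison, reusing the data and model defined above; the name nlmEval is just illustrative.)
(* same call as above, but sowing from EvaluationMonitor instead of StepMonitor *)
nlmEval = Reap@NonlinearModelFit[data, Exp[a x/(b + c x)], {a, b, c}, x,
   Method -> {NMinimize, Method -> {"DifferentialEvolution"}},
   EvaluationMonitor :> Sow[{a, b, c}]]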
Question 1: Is what is passed to/by EvaluationMonitor the fittest individual of the current generation (as Oleksandr R. suggested for StepMonitor, which sadly no longer seems to be accurate)?
Question 2: Is there a straightforward way to obtain the value of the fitness/cost function of said best individual, i.e. the sum of squared differences between estimate and data?
Answer
EvaluationMonitor is called whenever the objective function is evaluated, which is much more often than StepMonitor is called.
The reason for not getting any points back is that the StepMonitor specification is not propagated to the NMinimize call. Try the following syntax instead:
nlm = Reap @ NonlinearModelFit[data, Exp[a x/(b + c x)], {a, b, c}, x,
Method -> {NMinimize, StepMonitor :> Sow[{a, b, c}],
Method -> "DifferentialEvolution"}]
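As a quick check (a sketch reusing data and nlm from the call just shown; the names stepPoints and evalPoints are just illustrative), the list sown by StepMonitor is much shorter than the one sown by EvaluationMonitor, since the former fires once per generation and the latter at every objective evaluation:
stepPoints = First @ Last @ nlm;  (* one entry per DifferentialEvolution generation *)
evalPoints = First @ Last @ Reap@NonlinearModelFit[data, Exp[a x/(b + c x)], {a, b, c}, x,
    Method -> {NMinimize, Method -> "DifferentialEvolution"},
    EvaluationMonitor :> Sow[{a, b, c}]];
{Length[stepPoints], Length[evalPoints]}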
For the values of the objective function at these points, one could build the sum of squared residuals by hand, but there is also an internal function that can be used (the factor of two is needed because, in the default 2-norm case, the objective function is $\tfrac{1}{2}\,\mathbf{r}\cdot\mathbf{r}$, where $\mathbf{r}$ is the residual vector).
obj = Optimization`FindFit`ObjectiveFunction[data, Exp[a x/(b + c x)], {a, b, c}, x];
nlm = Reap @ NonlinearModelFit[data, Exp[a x/(b + c x)], {a, b, c}, x,
Method -> {NMinimize, StepMonitor :> Sow[2 obj[{a, b, c}]],
Method -> "DifferentialEvolution"}]
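To confirm the factor of two, one can compare 2 obj[...] against a sum of squared residuals built by hand at the fitted parameter values (a sketch; the helper name ssr is not part of the original answer):
(* sum of squared residuals for given numeric parameter values *)
ssr[{av_, bv_, cv_}] := Total[((#2 - Exp[av #1/(bv + cv #1)])^2 &) @@@ data];
With[{p = {a, b, c} /. First[nlm]["BestFitParameters"]},
 {ssr[p], 2 obj[p]}]  (* the two values should agree *)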