mathematical optimization - Using StepMonitor/EvaluationMonitor with DifferentialEvolution in NMinimize


My question about tracking the progress of a minimisation/fitting process is two-fold:


1. The first part of my question can be considered a slight duplicate/follow-up of this question. There the OP asked what exactly is passed to the StepMonitor option when using the "DifferentialEvolution" method. The answer by Oleksandr R. states that only the fittest individual of each generation is passed. This is fine by me, but when I try the following, nothing is sown:


data = BlockRandom[SeedRandom[12345];
   Table[{x, Exp[-2.3 x/(11 + .4 x + x^2)] + RandomReal[{-.5, .5}]},
    {x, RandomReal[{1, 15}, 20]}]];


nlm = Reap@NonlinearModelFit[data, Exp[a x/(b + c x)], {a, b, c}, x,
Method -> {NMinimize, Method -> {"DifferentialEvolution"}},
StepMonitor :> Sow[{a, b, c}]]

Compare this to the same expression using EvaluationMonitor instead of StepMonitor:
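
For reference, this is the identical call with only the monitor option swapped:

nlm = Reap@NonlinearModelFit[data, Exp[a x/(b + c x)], {a, b, c}, x,
   Method -> {NMinimize, Method -> {"DifferentialEvolution"}},
   EvaluationMonitor :> Sow[{a, b, c}]]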


Question 1: Is what is passed to EvaluationMonitor the fittest individual of the current generation (as Oleksandr R. suggested for StepMonitor, which sadly no longer seems to be accurate)?


Question 2: Is there a straightforward way to obtain the value of the fitness/cost function of said best individual, i.e. the sum of squared differences between estimate and data?



Answer



EvaluationMonitor is called whenever the objective function is evaluated, which is much more often than StepMonitor fires.
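
A quick way to see the scale of the difference is to count how many times each monitor fires on a toy problem (a minimal sketch; the exact counts depend on version and option settings):

evals = Last@Last@Reap@NMinimize[(x - 1)^2, x,
    Method -> "DifferentialEvolution", EvaluationMonitor :> Sow[x]];
steps = Last@Last@Reap@NMinimize[(x - 1)^2, x,
    Method -> "DifferentialEvolution", StepMonitor :> Sow[x]];
{Length[evals], Length[steps]} (* evaluations greatly outnumber steps *)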



The reason for not getting any points back is that the StepMonitor specification is not propagated to the NMinimize call. Try the following syntax instead:


nlm = Reap @ NonlinearModelFit[data, Exp[a x/(b + c x)], {a, b, c}, x, 
Method -> {NMinimize, StepMonitor :> Sow[{a, b, c}],
Method -> "DifferentialEvolution"}]

For the values of the objective function at these points, one could build the sum of squared residuals by hand, but there is also an internal function that can be used (the added factor of two is because in the default 2-norm case the objective function is (1/2) r.r, where r is the residual vector).


obj = Optimization`FindFit`ObjectiveFunction[data, Exp[a x/(b + c x)], {a, b, c}, x];
nlm = Reap @ NonlinearModelFit[data, Exp[a x/(b + c x)], {a, b, c}, x,
Method -> {NMinimize, StepMonitor :> Sow[2 obj[{a, b, c}]],
Method -> "DifferentialEvolution"}]
