numerics - Is it possible to use the LevenbergMarquardt algorithm for fitting a black-box residual function?


I have a black-box multi-argument, multi-parameter function of the form SRD[dataPoint_List, params_List] which accepts an experimental data point along with the parameters of the model and returns a vector whose first element is the squared relative deviation (SRD) between the computed model value and the experimental value for the given data point, and whose remaining elements are the gradient of the SRD at that point with respect to params. I wish to minimize the sum Sum[First[SRD[dataPoint, params]], {dataPoint, dataPoints}] with respect to params using the Levenberg-Marquardt algorithm and the computed gradient of the SRD with respect to params (equal to Sum[Rest[SRD[dataPoint, params]], {dataPoint, dataPoints}]). What is the best way to do this in Mathematica?
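
For concreteness, here is a minimal sketch of the shape of such a function, assuming a hypothetical linear model a x + b, data points of the form {x, y} and params = {a, b} (the real SRD is a black box):


(* hypothetical illustration only: squared relative deviation of the model
   a x + b from y, followed by its gradient with respect to {a, b} *)
SRD[{x_, y_}, {a_, b_}] := Module[{r = (a x + b - y)/y},
  Join[{r^2}, 2 r {x, 1}/y]]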



Answer



Here is the solution based on Faysal's code:


steps = 0;
FindMinimum[Null,
  {optimVariables, initialGuess}\[Transpose],
  Method -> {"LevenbergMarquardt",
    "Residual" -> Sqrt[2] residualVector[optimVariables],
    "Jacobian" -> {Sqrt[2] jacobianMatrix[optimVariables], EvaluationMonitor :> ++steps},
    "StepControl" -> {"TrustRegion", "StartingScaledStepSize" -> 1/1000,
      "MaxScaledStepSize" -> 1/10, "AcceptableStepRatio" -> 1/3}}]

Note that it is recommended to use exact numbers for the parameters of the "TrustRegion" method, because these parameters are used inside the algorithm without any check for consistency with WorkingPrecision. Also note that the actual residual vector and Jacobian must be multiplied by Sqrt[2] so that FindMinimum returns a minimum equal to


residualVector[optimVariables].residualVector[optimVariables]

and not to



residualVector[optimVariables].residualVector[optimVariables]/2

as it does by default.
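
For illustration, here is a toy run of the pattern above on made-up data with a hypothetical linear model y = a x + b (residualVector, jacobianMatrix, optimVariables and initialGuess are the placeholder names used in the code above, and jacobianMatrix is defined via the automatic differentiation shown further below); the Sqrt[2] factor makes the returned minimum equal to the sum of squared residuals:


(* made-up data and a linear model; hypothetical example only *)
data = {{1, 2.1}, {2, 3.9}, {3, 6.2}};
optimVariables = {a, b};
initialGuess = {1, 1};
residualVector[{a_, b_}] := (a #1 + b - #2) & @@@ data;
jacobianMatrix[_] = D[residualVector[optimVariables], {optimVariables}];

steps = 0;
FindMinimum[Null, {optimVariables, initialGuess}\[Transpose],
 Method -> {"LevenbergMarquardt",
   "Residual" -> Sqrt[2] residualVector[optimVariables],
   "Jacobian" -> {Sqrt[2] jacobianMatrix[optimVariables],
     EvaluationMonitor :> ++steps}}]

Afterwards steps holds the number of Jacobian evaluations recorded by the EvaluationMonitor.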


The Jacobian can be computed automatically with the following code:


jacobianMatrix[_] = D[residualVector[optimVariables], {optimVariables}]

One can restrict the Jacobian to be evaluated only for numerical values by defining it as:


jacobianMatrix[_List?(VectorQ[#, NumberQ] &)] = 
D[residualVector[optimVariables], {optimVariables}]
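
As a quick check of this restriction, using the hypothetical residualVector and optimVariables = {a, b} from the toy example above, with this restricted definition in place of the unrestricted jacobianMatrix[_] one:


Clear[jacobianMatrix];
jacobianMatrix[_List?(VectorQ[#, NumberQ] &)] =
  D[residualVector[optimVariables], {optimVariables}];

jacobianMatrix[optimVariables] (* stays unevaluated for the symbolic {a, b} *)
jacobianMatrix[{1., 2.}]       (* evaluates to the numeric Jacobian matrix *)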

