performance tuning - Considerations when determining efficiency of Mathematica code


I have two segments of code that do the same thing, and I want to determine which is more efficient.


What are the considerations when determining efficiency of Mathematica code?



  • Correctness/Equality of code segments

  • AbsoluteTiming vs Timing ... Why?

  • Clearing the cache


  • Memory footprint (speed vs size) ... Any suggestions on how to measure this?

  • More?


Any useful packages out there to assist in this?




Hypothetical Code Segment 1


numbers = {}; For[i = 0, i < 100, i++, AppendTo[numbers, i]]; numbers



Hypothetical Code Segment 2



Range[0, 99]



Testing Code


(* Test Equality *)
Print["Equality: ",
 (numbers = {}; For[i = 0, i < 100, i++, AppendTo[numbers, i]]; numbers) ==
  Range[0, 99]]

(* Timing Comparison *)

iterations = 10000;

times = Map[{
AbsoluteTiming[
numbers = {}; For[i = 0, i < 100, i++, AppendTo[numbers, i]]; numbers
][[1]],
AbsoluteTiming[
Range[0, 99]
][[1]]
} &, Range[1, iterations]];

{times1, times2} = Transpose[times];

PrintStats[times_] :=
 Print["Sum: ", Total[times], " Min: ", Min[times],
  " Max: ", Max[times], " Mean: ", Mean[times], " StdDev: ",
  StandardDeviation[times]]

PrintStats[times1];
ListPlot[times1, PlotRange -> All]

Histogram[times1]

PrintStats[times2];
ListPlot[times2, PlotRange -> All]
Histogram[times2]

Results:





Answer



First off, Timing isn't as accurate as AbsoluteTiming because it measures only CPU time spent in the kernel and ignores everything else, such as time spent waiting. Here is a particularly telling example. Keep in mind that neither tracks rendering or formatting of output; both measure purely the time spent computing in the kernel.



AbsoluteTiming[x = Accumulate[Range[10^6]]; Pause[x[[1]]]; resA = x + 3;]

==> {1.045213, Null}

Timing[x = Accumulate[Range[10^6]]; Pause[x[[1]]]; resB = x + 3;]

==> {0.031200, Null}

These are identical calculations, but Timing ignores the Pause, so it is way off.
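For short computations, a single AbsoluteTiming measurement is also quite noisy. If your version supports it (RepeatedTiming was added in version 10.2, if I recall correctly), it runs the expression several times and reports an averaged wall-clock time:

```mathematica
(* RepeatedTiming evaluates the expression repeatedly and returns
   {averaged wall-clock time, result}, smoothing out run-to-run noise *)
RepeatedTiming[Range[0, 99];]
```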


Now let's set up a toy example. Your timing tests are what I would typically do first when looking at efficiency.



f[x_Integer?Positive] := Accumulate[Range[x]]

g[x_Integer?Positive] :=
 Block[{result = Array[0, x], i},
  result[[1]] = 1;
  For[i = 2, i <= x, i++, result[[i]] = result[[i - 1]] + i];
  result
 ]

The AbsoluteTiming is quite different for these two approaches. Clearly the built-in function is preferable in this case.



AbsoluteTiming[resf = f[10^6];]

==> {0.015600, Null}

AbsoluteTiming[resg = g[10^6];]

==> {3.432044, Null}
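As an aside, if a procedural loop is genuinely unavoidable, Compile can remove much of the top-level evaluator overhead. This is a sketch, not part of the original comparison; gc is a hypothetical compiled variant of g:

```mathematica
(* Hypothetical compiled variant of g: the loop runs in compiled code
   rather than in the top-level evaluator *)
gc = Compile[{{x, _Integer}},
   Module[{result = Table[0, {x}]},
    result[[1]] = 1;
    Do[result[[i]] = result[[i - 1]] + i, {i, 2, x}];
    result]];

AbsoluteTiming[gc[10^6];]
```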

And of course, we should test that these produce equivalent results.


resf == resg


==> True

Now I will mention that there are times when Equal will return False even though the results agree for practical purposes. This may be acceptable in situations where we are only interested in low-precision, ballpark results.
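When that happens with approximate numbers, one option is to compare against an explicit tolerance rather than relying on Equal. The helper below, approxEqual, is hypothetical (not a built-in), just to illustrate the pattern:

```mathematica
(* Hypothetical helper: treat two numeric arrays as equal when their
   largest elementwise difference falls below a tolerance *)
approxEqual[a_, b_, tol_ : 10^-8] := Max[Abs[a - b]] < tol
```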


As for memory consumption, I hope someone else might elaborate on this part. One way to test it is with MemoryInUse.


m1 = MemoryInUse[];
f[10^6];
MemoryInUse[] - m1

==> 8001424


m1 = MemoryInUse[];
g[10^6];
MemoryInUse[] - m1

==> 24000656

Again, the system function wins hands down.


Edit:


The reason the second method showed such a substantial increase in MemoryInUse is because it doesn't produce a packed array. If we pack the output, it uses the same memory as the first. This tells me that MemoryInUse only tells us how much memory the result uses and nothing about the amount of memory used in intermediate computations.



m1 = MemoryInUse[];
Developer`ToPackedArray@g[10^6];
MemoryInUse[] - m1

==> 8001472

Edit 2: Here is a function I put together that I'm sure can be made more effective and efficient. It uses a binary search with MemoryConstrained to find the amount of memory required to evaluate an expression.


SetAttributes[memBinarySearch, HoldFirst]

memBinarySearch[expr_, min_, max_] :=
 Block[{low = min, high = max, med},
  med = IntegerPart[low + (high - low)/2];
  While[True,
   If[MemoryConstrained[expr, med] === $Aborted,
    low = med,
    high = med
   ];
   med = IntegerPart[low + (high - low)/2];
   If[Equal @@ Round[{low, med, high}, 2], Break[]]
  ];
  med
 ]

Here it is applied to f and g from above...


memBinarySearch[f[10^6], 1, 10^9]

==> 16000295

memBinarySearch[g[10^6], 1, 10^9]


==> 62499999

Note that memBinarySearch is only accurate to 2 bytes. For some reason (probably related to IntegerPart) it does not converge to the exact byte count requested.
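If your version has it (the one-argument form of MaxMemoryUsed arrived around version 10, if I recall correctly), there is also a much simpler check: MaxMemoryUsed[expr] reports the peak kernel memory used while evaluating an expression, which captures intermediate allocations that a before/after MemoryInUse difference misses:

```mathematica
(* Peak kernel memory used during each evaluation, including intermediates *)
MaxMemoryUsed[f[10^6];]
MaxMemoryUsed[g[10^6];]
```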

