
Why is there a huge performance gap using Map with more than 100 List entries



I'm using Map on a List like this:


cube = {{1, 1, 1}, {1, 1, 2}, {1, 1, 3}, {1, 1, 4}, ... , {5, 5, 4}, {5, 5, 5}}

Mapping over the whole list of 125 entries takes about 2.5 s.


AbsoluteTiming[
 Map[Apply[d[[#1, #2, #3]] &, #] &, cube];
]

{2.552146, Null}


Mapping over two sublists of fewer than 100 entries each, the whole thing takes nearly no time.


AbsoluteTiming[
 Join[
  Map[Apply[d[[#1, #2, #3]] &, #] &, cube[[1 ;; 99]]],
  Map[Apply[d[[#1, #2, #3]] &, #] &, cube[[100 ;; 125]]]
 ];
]

{0., Null}


Why is there such a huge performance gap? And how do I avoid it, other than by splitting my list?



Answer



If you look at SystemOptions[], like so,


Column[
 OpenerView /@
  (Replace[SystemOptions[], Rule[x_, y_] -> List[x, y], 1])
]


you see that under CompileOptions, if you click on the triangle to open it,


[screenshot: SystemOptions[] with the CompileOptions group expanded]


there is an option "MapCompileLength" -> 100. Set it to, e.g., 10 and see if it helps (do SetSystemOptions["CompileOptions" -> {"MapCompileLength" -> 10}]).


This option determines the length of the list above which Mathematica (tries to) compile the function to be mapped.
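
As an illustrative aside (not part of the original answer), you can inspect the current threshold and experiment with it directly. Since the option is the list length above which auto-compilation kicks in, raising it above 125 switches the auto-compilation attempt off for the Map in the question, and 100 is the default to restore afterwards:


(* read the current CompileOptions suboptions, including "MapCompileLength" *)
SystemOptions["CompileOptions"]

(* lower the threshold, as suggested above *)
SetSystemOptions["CompileOptions" -> {"MapCompileLength" -> 10}]

(* or raise it above the list length (125 here; 1000 is an arbitrary large value)
   so that this particular Map is never auto-compiled *)
SetSystemOptions["CompileOptions" -> {"MapCompileLength" -> 1000}]

(* restore the default *)
SetSystemOptions["CompileOptions" -> {"MapCompileLength" -> 100}]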


EDIT: Example:


Here's some data:


Length[cube = Tuples[Range[10], 4]]

And here's a function that is (a) inefficient on purpose and (b) designed to be compilable as-is (that's why I localise s, so that Compile will work):


d = (Module[{s = 0}, Do[s = s + #[[i]]^2, {i, Length@#}]; s] &)
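
To make "compilable as-is" concrete, here is a hand-compiled counterpart (my own illustration; dc is not a name from the answer). It is roughly what the auto-compiler builds for the mapped function once the list is long enough:


(* dc: the same sum-of-squares loop, compiled for a rank-1 list of integers *)
dc = Compile[{{v, _Integer, 1}},
   Module[{s = 0}, Do[s = s + v[[i]]^2, {i, Length[v]}]; s]];

dc /@ cube[[1 ;; 5]] === d /@ cube[[1 ;; 5]]  (* True: same results *)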


Now, set the auto-compilation length for Map to 100 (the default):


SetSystemOptions["CompileOptions" -> {"MapCompileLength" -> 100}]

and now test:


Needs["GeneralUtilities`"]
Quiet@BenchmarkPlot[d /@ # &, cube[[1 ;; #]] &, Range[90, 110]]

[benchmark plot of d /@ cube[[1 ;; n]] for n = 90, ..., 110]
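
If GeneralUtilities` is not available, a cruder check of the same effect (again my own variation, not from the original answer) is to time slices just below and just above the current "MapCompileLength"; whatever the absolute numbers on your machine, crossing the threshold changes how Map evaluates the function:


(* timings straddling the default threshold of 100 *)
AbsoluteTiming[d /@ cube[[1 ;; 99]];]
AbsoluteTiming[d /@ cube[[1 ;; 101]];]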


