
functions - Why is the new PositionIndex horribly slow?


This issue has largely been mitigated in 10.0.1. New timings for the final test below are:


Needs["GeneralUtilities`"]
a = RandomInteger[9, 5*^5];
myPosIdx[a] // AccurateTiming
cleanPosIdx[a] // AccurateTiming (* see self-answer below *)

PositionIndex[a] // AccurateTiming


0.0149384

0.0149554

0.0545865

Still several times slower here than the readily available alternatives, but no longer devastating.





Disconcertingly, I have discovered that the new (v10) PositionIndex is horribly slow.


Using Szabolcs's clever GatherBy inversion we can implement our own function for comparison:


myPosIdx[x_] :=
<|Thread[x[[ #[[All, 1]] ]] -> #]|> & @ GatherBy[Range @ Length @ x, x[[#]] &]
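
To see how the GatherBy inversion works, here is a small worked example (the list below is illustrative, not from the original post):


x = {"b", "a", "b", "c", "a"};

GatherBy[Range @ Length @ x, x[[#]] &]  (* {{1, 3}, {2, 5}, {4}}: positions grouped by value *)
x[[ {1, 2, 4} ]]                        (* {"b", "a", "c"}: the first position of each group gives the key *)
myPosIdx[x]                             (* <|"b" -> {1, 3}, "a" -> {2, 5}, "c" -> {4}|> *)
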

Check that its output matches:


sample = RandomChoice[{"a", "b", "c"}, 50];

myPosIdx[sample] === PositionIndex[sample]



True

Check performance in version 10.0.0 under Windows:


a = RandomInteger[99999, 5*^5];
myPosIdx[a] // Timing // First
PositionIndex[a] // Timing // First



0.140401

0.920406

Not a good start for the System` function, is it? It gets worse:


a = RandomInteger[999, 5*^5];
myPosIdx[a] // Timing // First
PositionIndex[a] // Timing // First



0.031200

2.230814

With fewer unique elements, PositionIndex actually gets slower! Does the trend continue?


a = RandomInteger[99, 5*^5];
myPosIdx[a] // Timing // First
PositionIndex[a] // Timing // First



0.015600

15.958902

Somewhere someone should be doing a face-palm right about now. Just how bad does it get?


a = RandomInteger[9, 5*^5];
myPosIdx[a] // Timing // First
PositionIndex[a] // Timing // First



0.015600

157.295808

Ouch. This has to be a new record for poor computational complexity in a System function. :o



Answer



First let me note that I didn't write PositionIndex, so I can't speak to its internals without doing a bit of digging (which at the moment I do not have time to do).


I agree performance could be improved in the case where there are many collisions. Let's quantify how bad the situation is, especially since complexity was mentioned!


We'll use the benchmarking tool in GeneralUtilities to plot time as a function of the size of the list:


Needs["GeneralUtilities`"]

myPosIdx[x_] := <|Thread[x[[#[[All, 1]]]] -> #]|> &@
  GatherBy[Range @ Length @ x, x[[#]] &];

BenchmarkPlot[{PositionIndex, myPosIdx}, RandomInteger[100, #] &, 16,
 "IncludeFits" -> True]

which gives:


[benchmark plot: PositionIndex vs. myPosIdx, time as a function of list length]


While PositionIndex wins for small lists (< 100 elements), it is substantially slower for large lists. It does still appear to be $O(n \log n)$, at least.


Let's choose a much larger random-integer range (1000000), so that we have essentially no collisions:
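
The command for this run was not reproduced in the post; presumably it was something along these lines (the range 10^6 and the other arguments are an assumption based on the preceding call):


(* hypothetical reconstruction of the second benchmark *)
BenchmarkPlot[{PositionIndex, myPosIdx}, RandomInteger[10^6, #] &, 16, "IncludeFits" -> True]
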


[benchmark plot: PositionIndex vs. myPosIdx over a large random-integer range, so essentially no collisions]


Things are much better here. We can see that collisions are the main culprit.



Now let's see how the speed for a fixed-size list depends on the number of unique elements:


BenchmarkPlot[{PositionIndex, myPosIdx}, RandomInteger[#, 10^4] &, 
2^{3, 4, 5, 6, 7, 8, 9, 10, 11, 12}]

[benchmark plot: time vs. number of distinct values for a fixed-size list]


Indeed, we can see that PositionIndex (roughly) gets faster as there are more and more unique elements, whereas myPosIdx gets slower. That makes sense: PositionIndex is probably appending each position to the value stored under the corresponding key in the association, so the fewer collisions there are, the fewer (slow) appends happen. myPosIdx, on the other hand, is bottlenecked by the cost of creating each equivalence class (which PositionIndex would no doubt be too, if it were faster). But this is all academic: PositionIndex should be strictly faster than myPosIdx, since it is written in C.
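
To make the collision argument concrete, here is a minimal sketch of the append-style accumulation speculated about above. The name appendPosIdx is made up, and this is certainly not the actual kernel implementation; it just shows why every collision costs a copy of the growing position list:


(* Illustrative only: store positions by appending to the value under each key.
   Append copies the existing list, so a key occurring k times costs O(k^2) overall. *)
appendPosIdx[x_] := Module[{a = <||>},
  Do[
   a[x[[i]]] = If[KeyExistsQ[a, x[[i]]], Append[a[x[[i]]], i], {i}],
   {i, Length @ x}];
  a]

appendPosIdx[{"b", "a", "b", "c", "a"}]
(* <|"b" -> {1, 3}, "a" -> {2, 5}, "c" -> {4}|> *)
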


We will fix this.

