list manipulation - how to efficiently apply function to all pairs of a set (and collect the results)
To build a graph, I need to apply a function f[a_, b_] to all pairs of a list (3500 elements). The function returns a link {a <-> b} if a particular relation holds; I collect all the results into a list and use it as input to Graph[].
The question is: is there an efficient and elegant way to do this? I've tried two ways: a recursive method and a (similar) iterative method. The former went over the usual recursion limits, the latter was slow and (I believe) not the optimal way to do it.
Both of these worked by applying f[] to the First[] element vs. the 2nd, 3rd, ..., Last[] elements and collecting the results. Then I'd remove the First[] from the list and repeat, doing n*(n-1)/2 evaluations of f[]. That many evaluations is required, but I definitely do not have a clean, functional implementation.
So, in short, if this can be turned into an efficient 1-liner, instead of a loop, please do let me know! Thanks in advance.
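For reference, the iterative scheme described in the question might be sketched like this (pairApply is an illustrative name, not from the original code):

```mathematica
(* Sketch of the described loop: apply f to First[] vs. the rest,
   drop the first element, and repeat until the list is exhausted *)
pairApply[f_, list_] :=
 Module[{rest = list, out = {}},
  While[Length[rest] > 1,
   out = Join[out, f[First[rest], #] & /@ Rest[rest]];
   rest = Rest[rest]];
  out]
```

The repeated Join here is one reason this approach is slow; the answer below avoids building the result incrementally.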
Answer
There are two built-in functions to generate pairs, either with (Tuples) or without (Subsets) duplication. Since your question states the number of iterations as $n(n-1)/2$, I believe you want the latter:
set = {1, 2, 3, 4};
Subsets[set, {2}]
{{1, 2}, {1, 3}, {1, 4}, {2, 3}, {2, 4}, {3, 4}}
The short notation for Apply at level 1 is @@@, so this gives f applied to each pair:
f @@@ Subsets[set, {2}]
{f[1, 2], f[1, 3], f[1, 4], f[2, 3], f[2, 4], f[3, 4]}
This is, in my opinion, the most elegant code to produce this specific result, and it is quite fast, but it is not memory efficient if you only need to collect a result for a low percentage of the pairs. Let's define f as follows (I use {} here generically):
f[a_, b_] /; b~Divisible~a := {a, b}
f[___] = Sequence[];
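Because the fallback definition returns Sequence[], non-matching pairs simply vanish when the results are collected; for example:

```mathematica
f[a_, b_] /; b ~Divisible~ a := {a, b}
f[___] = Sequence[];

(* f[2, 3] evaluates to Sequence[], which splices away in the list *)
f @@@ {{1, 2}, {2, 3}, {2, 4}}
(* {{1, 2}, {2, 4}} *)
```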
If we now compute all the pairs for Range[5000], they take over a gigabyte of memory:
pairs = Range@5000 ~Subsets~ {2};
pairs // ByteCount
1099780032
Applying f, we see that of the nearly 12.5 million pairs only 38376 return a result, which takes only about 3 MB of storage:
r1 = f @@@ pairs;
ByteCount[r1]
Length[r1]
3377120
38376
Yet, the maximum memory used is 1.6GB:
MaxMemoryUsed[]
1629793536
A simple method to reduce memory consumption is to process the subsets in blocks, rather than all at once, as follows:
set = Range@5000;
n = Length@set;
max = n (n - 1)/2;
block = 10000;
Timing[
r2 =
Join @@ Array[
f @@@ Subsets[set, {2}, block {# - 1, #} + {1, 0}] &,
    Ceiling[max/block]
];
]
Length[r2]
MaxMemoryUsed[]
{8.299, Null}
38376
19769800
This only uses a maximum of ~20MB of memory, only a few MB over the baseline on my system.
(It issues a Subsets::take message, but there is no error.)
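To see how the blocks index into the full sequence of pairs, note that the third argument of Subsets selects subsets by position, so each Array slot covers positions block(#-1)+1 through block·#; a small example:

```mathematica
(* The third argument of Subsets picks a positional range of subsets:
   here, subsets 2 through 4 of the 2-subsets of {1, 2, 3, 4} *)
Subsets[Range@4, {2}, {2, 4}]
(* {{1, 3}, {1, 4}, {2, 3}} *)
```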
My preferred method
Another method, and the one that I prefer, is to compute the pairs more manually, allowing f to be embedded in the process so that all pairs need not be generated beforehand. This uses Outer to effect the pair generation for each element separately (that is, all pairs starting with a given element).
pairMap[f_, s_] := Module[{ss = s},
Flatten[Outer[f, {#}, ss = Rest@ss, 1] & /@ Most@s, 2] ]
pairMap[f, Range@5000] // Length // Timing
MaxMemoryUsed[]
{7.816, 38376}
19430512
This also uses only a small amount of memory, and testing should bear out that it is faster as well. A variation of this method that may be even faster is to not build an output expression at all, relying instead on Sow and Reap to gather your results:
g[a_, b_] /; b ~Divisible~ a := Sow[ {a, b} ]
pairScan[f_, s_] := Module[{ss = s}, Outer[f, {#}, ss = Rest@ss, 1] & ~Scan~ Most@s ]
Reap[ pairScan[g, Range@5000] ][[2, 1]] // Length // Timing
MaxMemoryUsed[]
{6.583, 38376}
18757552
An argument for Do loops
While Outer is somewhat faster, after further consideration and conferring with jVincent, I think perhaps after all a Do loop is as good as anything. One could write pairScan in this way:
pairScan2[f_, s_] := Module[{ss = s}, Do[f[i, j], {i, ss}, {j, ss = Rest@ss}] ]
Reap[ pairScan2[g, Range@5000] ][[2, 1]] // Length // Timing
MaxMemoryUsed[]
{7.613, 38376}
18711080