
parallelization - Effective parallel processing of large items


When trying to parallelize a task that involves items of a larger size, I cannot achieve efficient parallelization. To demonstrate this, let's launch some kernels:


LaunchKernels[12];


I am facing this problem with real-world data that I cannot share here. Instead, we generate some random data and adjust the following parameters to (a) reflect the problem and (b) get reasonable timings:


itemsize = 20000; (* The size of an individual item *)
numberofitems = 200; (* Number of items to process *)
difficulty = 500; (* Processing difficulty: the portion of an individual item that is actually processed *)

Now let's generate the random values:


randomValues = Parallelize@Table[
Transpose@{RandomReal[1, itemsize], RandomReal[1, itemsize]},
{numberofitems}
];


An individual item is about 320 kB in size; the full dataset is about 64 MB:


ByteCount /@ {randomValues[[1]], randomValues}
(* {320152, 64032072} *)

Now we compare Map and ParallelMap with an arbitrary function that takes a reasonable amount of time to process, like FindCurvePath.


map = Map[
FindCurvePath[#[[1 ;; difficulty]]] &,
randomValues
]; // AbsoluteTiming


(* {11.9619, Null} *)

pmap = ParallelMap[
FindCurvePath[#[[1 ;; difficulty]]] &,
randomValues,
Method -> "ItemsPerEvaluation" -> 10
]; // AbsoluteTiming

(* {23.6492, Null} *)


Surprisingly, the parallel version is twice as slow. Watching the CPU usage of the main kernel versus the subkernels shows that most of the evaluation time is spent at the beginning in the main kernel; the actual processing in the subkernels then finishes in less than 2 seconds.


Note that I intentionally made the items larger than what is actually processed, so the full items of 320 kB in size (2 × 20 000 random reals) need to be distributed to the subkernels. If we reduce the item size to the amount that is actually processed, things change drastically:


pmap2 = ParallelMap[
FindCurvePath[#[[1 ;; difficulty]]] &,
randomValues[[All, 1 ;; difficulty]], (* only use a small part of the items *)
Method -> "ItemsPerEvaluation" -> 10
]; // AbsoluteTiming

(* {2.03152, Null} *)


Now we get a performance improvement as expected. The result is the same:


map === pmap === pmap2
(* True *)

Apparently, the distribution of the large items to the subkernels is the bottleneck. Note that unlike in this demonstration, my real-world application does need all the data that is present in the items.
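
A rough way to see this (not part of the original benchmark, and DistributeDefinitions is only a proxy for the transfer that ParallelMap performs internally) is to time how long it takes just to ship the full dataset to the subkernels:


(* timing only the transfer of the full dataset to the 12 subkernels *)
AbsoluteTiming[DistributeDefinitions[randomValues];]
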


I did not find any way to improve the parallel performance. Changing the Method to "FinestGrained" or "CoarsestGrained" performs even worse. Any ideas on how to make this parallel processing efficient?



Answer



Serializing the data to a ByteArray object seems to overcome the data-transfer bottleneck. The necessary functions, BinarySerialize and BinaryDeserialize, were introduced in Version 11.1.
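
As a minimal sketch of the idea (not part of the original benchmark): a single item survives the serialization round trip unchanged, and the intermediate form is a single ByteArray that is cheap to send between kernels:


serialized = BinarySerialize[randomValues[[1]]];
Head[serialized] (* ByteArray *)
BinaryDeserialize[serialized] === randomValues[[1]] (* True *)
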


Here is a simple function implementing a ParallelMap that serializes the data before transferring it to the subkernels and has the subkernels deserialize it before processing:



ParallelMapSerialized[f_, data_, opts___] := ParallelMap[
f[BinaryDeserialize@#] &,
BinarySerialize /@ data,
opts
]

Running the benchmark again:


map = Map[
FindCurvePath[#[[1 ;; difficulty]]] &,
randomValues
]; // AbsoluteTiming

(* {9.60715, Null} *)

pmap = ParallelMap[
FindCurvePath[#[[1 ;; difficulty]]] &,
randomValues,
Method -> "ItemsPerEvaluation" -> 10
]; // AbsoluteTiming


(* {17.5937, Null} *)

pmapserialized = ParallelMapSerialized[
FindCurvePath[#[[1 ;; difficulty]]] &,
randomValues,
Method -> "ItemsPerEvaluation" -> 10
]; // AbsoluteTiming

(* {1.85387, Null} *)


map === pmap === pmapserialized
(* True *)

Serialization led to an almost 10-fold speed-up compared to plain ParallelMap, and a roughly 5-fold speed-up compared to serial processing.
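
If the results returned by f were themselves large, the same trick could in principle be applied to the return trip as well. A hypothetical variant (not benchmarked here) that serializes both directions might look like this:


ParallelMapSerialized2[f_, data_, opts___] := BinaryDeserialize /@ ParallelMap[
BinarySerialize[f[BinaryDeserialize[#]]] &, (* deserialize input, apply f, serialize result in the subkernel *)
BinarySerialize /@ data,
opts
]

For FindCurvePath the returned paths are tiny compared to the input items, so this was not needed in the benchmark above.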

