
machine learning - How to use Mathematica to train a network using out-of-core classification?


I see there is documentation about how to train a network using out-of-core image classification, and this related question. But those examples only deal with images.


I want to use a binary file as the data (a sequence-to-sequence case), for example like this:


data = Flatten@Table[{x, y} -> x*y, {x, -1, 1, .05}, {y, -1, 1, .05}];
mydata = Flatten[data /. (a_ -> b_) :> {a, b}]; (* flatten each rule into a {x, y, x*y} triple *)
file = OpenWrite["C:\\Users\\xiaoz\\Downloads\\test_data_SE.dat",
   BinaryFormat -> True]; (* the stream must be opened before BinaryWrite *)
BinaryWrite[file, mydata, "Real32", ByteOrdering -> -1];
Close[file];

Length of data: 1681
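As a sanity check (a minimal sketch, assuming the file path used below), the file can be read back and the 1681 triples recovered:

raw = BinaryReadList["C:\\Users\\xiaoz\\Downloads\\test_data_SE.dat", "Real32"];
Length[Partition[raw, 3]] (* 1681 examples, each {x, y, x*y} *)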


The data looks like this:



[image: preview of the {x, y, x*y} triples]


Usually the data is very large, so this is only a small example.


I use this code:


fileName = "C:\\Users\\xiaoz\\Downloads\\test_data_SE.dat";
file = OpenRead[fileName, BinaryFormat -> True];
net = NetChain[{32, Tanh, 1}, "Input" -> 2, "Output" -> "Scalar"];
size = FileByteCount[fileName];

(* read one batch; each example is 3 "Real32" values of 4 bytes each,
   wrapping back to the start of the file when the end is reached *)
read[file_, batchSize_] := (
  If[StreamPosition[file] + batchSize*3*4 > size,
   SetStreamPosition[file, 0]];
  BinaryReadList[file, "Real32", batchSize*3])

batchSize = 128;
Do[
 data = read[file, batchSize];
 trainingData = #[[1 ;; 2]] -> #[[3]] & /@ Partition[data, 3];
 net = NetTrain[net, trainingData, BatchSize -> batchSize,
   MaxTrainingRounds -> 1, TrainingProgressReporting -> None],
 {200}]
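An alternative to the manual Do loop (a sketch, assuming the same open stream and read function as above) is to pass NetTrain a generator function; NetTrain then calls it once per batch with the requested "BatchSize", so the whole file never has to be in memory. The "RoundLength" setting here is illustrative:

(* generator: pull one batch of {x, y} -> x*y examples from the stream on demand *)
gen[spec_] := #[[1 ;; 2]] -> #[[3]] & /@
   Partition[read[file, spec["BatchSize"]], 3];

net = NetTrain[net, {gen, "RoundLength" -> 1681}, BatchSize -> 128,
   MaxTrainingRounds -> 200, TrainingProgressReporting -> None]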

ContourPlot[net[{x, y}], {x, -1, 1}, {y, -1, 1},
 ColorFunction -> "RedGreenSplit", PlotLegends -> Automatic]
Close[file]

[image: contour plot of the trained network]


You can see it is slow, and the result is not perfect.


So how can I use Mathematica to train a network using out-of-core data?


Related: TensorFlow can handle this using a queue and multiple threads; see "What's going on in tf.train.shuffle_batch and tf.train.batch?"


And the Wolfram blog says:



Another thing that’s being introduced as an experiment in Version 11.3 is the MongoLink package, which supports connection to external MongoDB databases. We use MongoLink ourselves to manage terabyte-and-beyond datasets for things like machine learning training. And in fact MongoLink is part of our large-scale development effort—whose results will be seen in future versions—to seamlessly support extremely large amounts of externally stored data.





Answer



Okay here's how you do out-of-core training with HDF5:


input = RandomReal[1, {1000, 2}];
output = RandomReal[1, {1000, 2}];

Get["GeneralUtilities`"];
ExportStructuredHDF5["test.h5", <|"Input" -> input, "Output" -> output|>]


NetTrain[LinearLayer["Input" -> 2, "Output" -> 2], File["test.h5"]]

The use of ExportStructuredHDF5 is just for convenience; you could also use Export, but it doesn't support associations directly. But again, for a real-world out-of-core example you'll need to build a dataset that consists of extendible columns, so that new batches can be appended to the file.


It is also important to note that you need to randomize the order of the data yourself before writing it to the H5 file.
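For example, a minimal sketch that shuffles input and output with one shared permutation before exporting:

perm = RandomSample[Range[Length[input]]]; (* same permutation for both arrays *)
ExportStructuredHDF5["test.h5",
 <|"Input" -> input[[perm]], "Output" -> output[[perm]]|>]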

