I've got this CSV file I've imported that has tens of millions of lines in it. It takes around 20 minutes to import. I've been working with it for a while and have the processed data spread out in a bunch of variables.
Now Windows is bugging me that I need to restart the computer.
I thought about gathering all the data up in a table and then exporting and importing it, but that would be a lot of hassle and take ages. I also thought about just saving the notebook and re-evaluating it, but with this amount of data that would also take a long time.
What is the best way to save all the data so that I can get it back after restarting the computer? Something fast and with a minimum of hassle would be great.
PS. I have no idea how to tag this thing. There is apparently no big-data tag.
Answer
Assuming you haven't placed your variables in a non-standard context, you can save them all at once using DumpSave's second syntax form, which saves everything in the indicated context:
Quit[] (* start a fresh kernel *)
x = 1; (* define some symbols *)
y = 2;
z[x_] := x^2
Names["Global`*"] (* Check they're there *)
(* ==> {"x", "y", "z"} *)
(* Save everything in the context *)
DumpSave["C:\\Users\\Sjoerd\\Desktop\\dump.mx", "Global`"];
Quit[] (* kill kernel to simulate a new start *)
Names["Global`*"] (* Are we clean? *)
(* ==> {} *)
(* Load the saved symbols *)
<< "C:\\Users\\Sjoerd\\Desktop\\dump.mx"
(* Are they there? *)
Names["Global`*"]
(* ==> {"x", "y", "z"} *)
z[y]
(* ==> 4 *)
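If only some of your variables are large, you can also use DumpSave's first syntax form, which takes a list of symbols instead of a whole context. A minimal sketch (the variable names `bigData` and `lookup` are made up for illustration, and `FileNameJoin` with `$HomeDirectory` is used to avoid a hard-coded path):

```mathematica
(* Hypothetical large variables standing in for your processed data *)
bigData = RandomReal[1, 10^6];
lookup = AssociationThread[Range[3] -> {"a", "b", "c"}];

(* Save just these two symbols, definitions included *)
file = FileNameJoin[{$HomeDirectory, "Desktop", "subset.mx"}];
DumpSave[file, {bigData, lookup}];

(* After a kernel restart, restore them with Get *)
<< file
```

Note that .mx files are a binary format optimized for fast loading, but they are not portable: a file written on one operating system or machine architecture generally cannot be read on another.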