I just started working with databases through Mathematica. I have a large data set that I currently store in an .mx file. Now I need to store the data in a database, and I have installed Microsoft SQL Server 2012 on my system. My problem is that I don't know how I should store my big list of data in the database. My options seem to be:
1. store the .mx file in the database and retrieve it using an SQL command in Mathematica, or
2. create a column in a database table and insert one element of my data list into each row.
I tried the SQL route and inserted the .mx file into the database, since I thought that would be better than inserting the list element by element:
insert into test select * from openrowset(bulk 'I:\MsSql\test.mx', single_blob) as test1
I executed the above line in Microsoft SQL Server 2012. It appeared to work, but when I tried to retrieve my data set from the database table through Mathematica, I got a big sublist containing what looks like random numbers and characters:
{{282A546869732069732061......}}
I have googled it, but I couldn't find a useful answer. Please explain what I did wrong, or refer me to good material I can learn from.
Answer
I would rule out option #1, as it would be like working inside a spreadsheet but using only cell A1.
For inserting large amounts of data I recommend just using DatabaseLink's SQLExecute.
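If you have not opened a connection yet, here is a minimal sketch. The driver name, server address, database name, and credentials below are all assumptions/placeholders; check the drivers registered in your DatabaseLink installation and substitute your own values:

Needs["DatabaseLink`"];
(* Open a JDBC connection to SQL Server. "Microsoft SQL Server(jTDS)"
   refers to the jTDS driver bundled with DatabaseLink; the host, port,
   database name, user, and password are placeholders. *)
conn = OpenSQLConnection[
   JDBC["Microsoft SQL Server(jTDS)", "localhost:1433/myDatabase"],
   "Username" -> "myUser", "Password" -> "myPassword"];

When you are done, close the connection with CloseSQLConnection[conn].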
Since your data set is large and you want to insert it as fast as possible, be aware that there are very large differences in performance depending on how you do this.
Let's look at some examples. Start by creating some sample data:
data = RandomInteger[100, {20000, 1}];
So data contains 20000 random integers, each wrapped in its own sublist. Let's assume that your database connection is stored in the variable conn and that there is a table myTable in your database with a single column that holds integers.
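In case myTable does not exist yet, it can be created over the same connection. This is a sketch under the assumptions above; the column name "value" is a placeholder:

(* Create a one-column integer table matching the examples below *)
SQLExecute[conn, "CREATE TABLE myTable (value INTEGER)"];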
Here we see that a very fast operation repeated 20k times can take a long time:
Scan[SQLExecute[conn, "INSERT INTO myTable VALUES (`1`) ", {#}] &,
Flatten[data]] // Timing
{323.609754, Null}
Instead, you can send all of the numbers to DatabaseLink at the same time for insertion, which results in much better performance:
SQLExecute[conn, "INSERT INTO myTable VALUES (`1`) ", data]; // Timing
{3.961870, Null}
We can also try sending the numbers in batches of 100:
Scan[SQLExecute[conn, "INSERT INTO myTable VALUES (`1`) ", #] &,
Partition[data, 100, 100, 1, {}]] // Timing
{7.210615, Null}
And finally we can build a huge query string:
query = "INSERT INTO myTable VALUES " <>
StringReplace[
StringTake[ToString@data, {2, -2}], {"{" -> "(",
"}" -> ")"}]; // Timing
{0.039831, Null}
And execute it at a blazing fast speed:
SQLExecute[conn, query]; // Timing
{0.025865, Null}
This last approach must have a downside; I would guess it uses more RAM, since the entire query is built as a single string in memory.
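Whichever insertion route you take, you can verify the result by reading the rows back. A minimal sketch, assuming the same conn and myTable as above:

(* Read all rows back as a list of sublists, one per row *)
result = SQLExecute[conn, "SELECT * FROM myTable"];
Length[result]  (* should match the number of rows inserted *)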