Importing a large MySQL dump

Tuesday, 9 December 2008


As mentioned previously, I have recently been working on converting some old classic ASP pages to PHP. It was also a company decision to continue running MSSQL alongside PHP. This hasn't caused too many dramas and isn't really what this post is about, but it gives you some background.

Recently my manager handed me a CD containing a 350+ MB SQL dump from MySQL and asked me to verify the data. I thought the nicest way to do this would be to have phpMyAdmin up and running so that they could quickly browse through the results once the data had been imported, especially as they are used to MSSQL's SQL Server Management Studio.

After several attempts at importing the dump into the new database from the command line, I had no success. I had tried increasing max_allowed_packet in the my.ini file, but I still kept getting "MySQL server has gone away" errors.
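For anyone unfamiliar with the process, the import itself is just a redirect into the mysql client. The file and database names below are placeholders, and the client-side packet option is only a sketch; the server-side setting still has to be large enough as well:

```shell
# Import the dump into an existing database
# (substitute your own user, database, and file names):
mysql -u root -p mydatabase < dump.sql

# The client also has its own packet limit, which can be raised
# per invocation without touching my.ini:
mysql -u root -p --max_allowed_packet=512M mydatabase < dump.sql
```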

After getting in contact with the person who made the dump, I asked how they had verified it and whether they had any success importing the data back in. It turns out I was on the right track: to get the file to import, they had increased max_allowed_packet to a whopping 10000000M. I tried this and the import worked straight away.

Here is the entry in my.ini for those who want to know:
set-variable=max_allowed_packet=10000000M

I suggest switching it back to something more sensible after the import. If you want a better idea of what the packet size relates to and how to change the setting, have a look here:
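To switch back, you can check the current value and, on MySQL 5.0 and later, adjust it at runtime rather than editing my.ini and restarting. The credentials below are placeholders, and SET GLOBAL requires the SUPER privilege and only affects new connections:

```shell
# Check the current server-side value (reported in bytes):
mysql -u root -p -e "SHOW VARIABLES LIKE 'max_allowed_packet'"

# Set it back to a more sensible 64 MB without a restart:
mysql -u root -p -e "SET GLOBAL max_allowed_packet = 67108864"
```

Note that a runtime change is lost on restart, so a permanent value still belongs in my.ini.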
http://dev.mysql.com/doc/refman/5.0/en/packet-too-large.html