
sql file splitter to the rescue

Published on May 31, 2008 in mysql, PHP

My Adventures in sql dumping and database migration

My latest problem project presented itself as a phpBB conversion. The client was having issues with an old version of phpBB, so we elected to move them to the new version. While this is a great idea on the surface, I did not realize how big the database was. I started out doing an export using phpMyAdmin, not knowing the file sizes I was dealing with. When that failed, I decided to do a mysqldump from the shell. Imagine the look of shock and dumbfoundedness on my face when the dump finished and I discovered a 300+ MB SQL file.

Why is this a problem? Well, most ISPs have upload limits, usually under 20 MB, for imports through phpMyAdmin. The limit on my default XAMPP install is 16 MB. I decided to gzip the file, which still left me with a 100+ MB result. At least now I could download the file to my local machine and try to figure out what to do.

First Trial: phpMyAdmin Local File Upload

I edited my php.ini file and changed the upload and execution-time settings to try to enable a way to upload the file:

  max_execution_time = 560
  max_input_time = 160
  memory_limit = 60M
  upload_max_filesize = 160M
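
If you go this route, it also helps to confirm which values PHP is actually using, since the php.ini the web server reads is not always the one you edited. A throwaway script like the following (the name and exact list of settings are just my suggestion) prints the effective limits; note that post_max_size also caps uploads, so it is worth checking too:

  <?php
  // check_limits.php - print the settings that matter for large imports.
  $settings = array('max_execution_time', 'max_input_time', 'memory_limit',
                    'upload_max_filesize', 'post_max_size');
  foreach ($settings as $setting) {
      echo $setting . ' = ' . ini_get($setting) . "\n";
  }
  ?>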

After an hour of battling the settings without finding a combination that would work, I turned to the phpMyAdmin documentation to see if I could find a solution.

phpMyAdmin Config Change
My first change was a nice config setting called $cfg['UploadDir']. With this option pointed at a folder on the server, phpMyAdmin can read import files directly from the local disk instead of through an HTTP upload. When you open phpMyAdmin it gives you a list of the files in that folder to select from; you pick the file and it performs a direct import of the larger-than-limit SQL file. However, this still didn't solve the timeout problems with a 100+ MB SQL file. I had to look further.
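
For anyone wanting to try the same thing, the change is a couple of lines in phpMyAdmin's config.inc.php; the folder names below are just examples, any directory the web server can read will do:

  <?php
  // config.inc.php (excerpt)
  // Files placed in this folder appear in a drop-down on the Import page,
  // so they are read from disk instead of being uploaded over HTTP.
  $cfg['UploadDir'] = './upload';  // example path, relative to the phpMyAdmin folder
  $cfg['SaveDir']   = './save';    // optional: lets exports be written server-side too
  ?>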

SQL file splitter

After kicking around some workarounds, I realized the only way to beat this was to split the file into parts. I could either do this by hand or find a tool to do it for me.

Google is a wonderful tool for finding things and it didn't let me down this time either.

I found a blog that outlined exactly what I was looking for: SQL file splitter. From reviewing his blog, it appeared he had the exact same issues I had. The solution was a program called SQLDumpSplitter 2, by Philip Lehmann-Böhm (http://www.philiplb.de). The main links on the blog no longer worked, but a Dutch website had left a blog comment with a direct link to the file. You can download it from here: SQLDumpSplitter2. Hopefully that file link will stick around for a while.

I split the 300 MB file into 23 chunks of 16 MB each. This worked fairly well and I was able to import the large file, but it still took babysitting and remembering which file I had imported last. In fact, I spent several hours importing and still ran into some timeout issues.

I then decided to see if I could find another solution, so it was back to Google.

php mysql large file importer

was the keyword search I used. What I found was this blog post: How to import large MySQL data files. This WAS the solution to all my problems!!! In that post there was a link to a site and a program called BigDump.

BigDump is a staggered MySQL dump importer. What this means is that you start the script, and it automatically imports the large file in restartable sessions until it is complete. That means no file splitting. It is truly an amazing 10K or so of code.
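
To give a feel for what "staggered" means, here is a minimal sketch of the idea, not BigDump's actual code: execute a batch of statements per HTTP request, remember the byte offset, and redirect back to yourself until the file is done. The credentials, file name and batch size are placeholders, and it ignores details a real importer handles properly (such as semicolons inside quoted string data):

  <?php
  // staggered_import.php - a sketch of a staggered SQL import, not BigDump itself.
  $db_host  = 'localhost';
  $db_name  = 'forum';        // placeholder database name
  $db_user  = 'dbuser';       // placeholder credentials
  $db_pass  = 'secret';
  $dumpfile = 'dump.sql';     // the large dump sitting next to this script
  $per_run  = 300;            // statements to execute per HTTP request

  $offset = isset($_GET['offset']) ? (int)$_GET['offset'] : 0;

  $db = new mysqli($db_host, $db_user, $db_pass, $db_name);
  if (mysqli_connect_errno()) {
      die('Connection failed: ' . mysqli_connect_error());
  }

  $fh = fopen($dumpfile, 'rb');
  fseek($fh, $offset);

  $query = '';
  $done  = 0;
  while ($done < $per_run && ($line = fgets($fh)) !== false) {
      $trimmed = trim($line);
      // Skip blank lines and comments in the dump.
      if ($trimmed === '' || substr($trimmed, 0, 2) === '--' || $trimmed[0] === '#') {
          continue;
      }
      $query .= $line;
      // Naive statement detection: a line ending in ';' closes the statement.
      if (substr(rtrim($line), -1) === ';') {
          if (!$db->query($query)) {
              die('Error near byte ' . ftell($fh) . ': ' . $db->error);
          }
          $query = '';
          $done++;
      }
  }

  if (feof($fh) && $query === '') {
      echo 'Import complete.';
  } else {
      // Hand the next batch to a fresh request so max_execution_time never hits.
      $next = ftell($fh);
      echo '<meta http-equiv="refresh" content="1;url=?offset=' . $next . '">';
      echo 'Imported up to byte ' . $next . ', continuing...';
  }
  fclose($fh);
  ?>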

To import a large file you perform the following steps:

  1. Copy bigdump.php to the website where you want to import the file.
  2. If the dump is larger than the server's upload limit, copy the dump file into the same folder.
  3. Edit the bigdump.php file and set the database name, username and password (see the settings sketch after this list).
  4. If the file size is larger than the limit, also enter the file name into the "filename" field.
  5. Execute the script by going to http://yourdomainlocation/bigdump.php
  6. Wait for the import to complete and be prepared for the shock and awe.
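
For step 3, the edits happen in a small settings block near the top of bigdump.php. It looks roughly like this; the exact variable names can differ between BigDump versions and the values here are placeholders, so check the copy you download:

  // Settings near the top of bigdump.php (illustrative excerpt).
  $db_server   = 'localhost';      // database host
  $db_name     = 'phpbb';          // placeholder: your database name
  $db_username = 'dbuser';         // placeholder credentials
  $db_password = 'secret';

  // Only needed when the dump is too big to upload through BigDump's own form:
  // drop the dump file next to bigdump.php and name it here (step 4).
  $filename    = 'forum_dump.sql'; // placeholder file name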

For my test I used the uncompressed 300 MB SQL file. The documentation states you can use the gzipped version as well. It took approximately 15 minutes, or less, to import that file with BigDump. I am still amazed. Had I found this earlier today, this blog post might not actually exist.

Hurrah for BigDump!!!! I gladly sent a PayPal donation for this code! It saved my life.

I hope that users looking for an SQL splitter will find this blog post and save themselves the initial headaches I had.


 