Export a large MySQL table as multiple smaller files

Updated: 2023-01-31 09:05:21

I have a very large MySQL table on my local dev server: over 8 million rows of data. I loaded the table successfully using LOAD DATA INFILE.

I now wish to export this data and import it onto a remote host.

I tried using LOAD DATA LOCAL INFILE against the remote host, but after around 15 minutes the connection fails. I think the only solution is to export the data into a number of smaller files.

The tools at my disposal are PhpMyAdmin, HeidiSQL and MySQL Workbench.

I know how to export as a single file, but not multiple files. How can I do this?
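
For reference, if the export really does have to be split, one rough sketch is to dump the table in primary-key ranges with mysqldump's --where option. This assumes the table has an integer primary key named id; the chunk boundaries below are purely illustrative.

 # dump the table definition once, then the rows in id ranges so each file stays small
 @host1:~ $ mysqldump -u <username> -p <database> <table> --no-data > schema.sql
 @host1:~ $ mysqldump -u <username> -p <database> <table> --no-create-info \
              --where="id >= 0 AND id < 2000000" | gzip > part1.sql.gz
 @host1:~ $ mysqldump -u <username> -p <database> <table> --no-create-info \
              --where="id >= 2000000 AND id < 4000000" | gzip > part2.sql.gz
 # ...and so on for the remaining ranges; each part can then be imported on its own

On the remote host, load schema.sql first and then each part in turn. That said, the answer below shows that splitting is usually unnecessary once the dump is compressed.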

I just did an import/export of a (partitioned) table with 50 million records; it took just 2 minutes to export it from a reasonably fast machine and 15 minutes to import it on my slower desktop. There was no need to split the file.

mysqldump is your friend, and since you have a lot of data, it's better to compress it:

 @host1:~ $ mysqldump -u <username> -p <database> <table> | gzip > output.sql.gz
 @host1:~ $ scp output.sql.gz host2:~/
 @host1:~ $ rm output.sql.gz
 @host1:~ $ ssh host2
 @host2:~ $ gunzip < output.sql.gz | mysql -u <username> -p <database>
 @host2:~ $ rm output.sql.gz
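
As a variation on the commands above (not part of the original answer), the dump can also be streamed over ssh in one pipeline, so no intermediate output.sql.gz is written on either host:

 # sketch: stream the compressed dump straight into the remote server.
 # The remote mysql client must not prompt for a password here (e.g. keep the
 # credentials in ~/.my.cnf on host2), otherwise the prompt would consume part
 # of the piped dump.
 @host1:~ $ mysqldump -u <username> -p <database> <table> | gzip \
              | ssh host2 'gunzip | mysql <database>'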