On a number of client sites, I use HyperDB or SharDB to spread a WordPress Multisite installation across multiple databases on a single server. However, in my local dev environments, it’s annoying to have thousands of databases. So I use the following technique to create a copy of the remote site that operates in a single database locally.
- Use `mysqldump` to get a backup file. The following command ensures that you don't pull in `information_schema` or any other unrelated databases; you can add other databases to ignore to the `NOT IN` list:

  ```shell
  $ mysql -u [username] -p -B -N -e "SELECT SCHEMA_NAME FROM information_schema.SCHEMATA WHERE SCHEMA_NAME NOT IN ('mysql','tmp','innodb','information_schema','performance_schema')" | xargs mysqldump -u [username] -p --add-drop-table --skip-lock-tables --quick --extended-insert --result-file=[path/to/your/dumpfile.sql] --databases
  ```
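To see how that pipeline composes: `mysql -B -N` prints one schema name per line with no headers, and `xargs` tacks every name onto a single `mysqldump` invocation, which is what `--databases` expects. Here's the same pattern with the query swapped for a `printf` of made-up schema names and `mysqldump` prefixed with `echo`, so nothing actually runs:

```shell
# Stand-in for the `mysql -B -N -e "SELECT SCHEMA_NAME ..."` output:
# one database name per line (names here are made up for illustration).
cmd=$(printf '%s\n' wp_blog_1 wp_blog_2 wp_global \
  | xargs echo mysqldump -u someuser -p --databases)
echo "$cmd"
# → mysqldump -u someuser -p --databases wp_blog_1 wp_blog_2 wp_global
```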
- Use `sed` to remove all of the `CREATE DATABASE` and `USE` lines in the dumpfile. This prevents the multiple databases from being created when you import locally:

  ```shell
  $ sed -i '' -e '/^CREATE DATABASE /d' /path/to/dumpfile.sql
  $ sed -i '' -e '/^USE /d' /path/to/dumpfile.sql
  ```
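Incidentally, the `-i ''` in-place flag is BSD/macOS `sed` syntax; GNU `sed` on Linux wants a bare `-i`. A quick way to sanity-check the cleanup against a throwaway file (the sample statements below are illustrative, not taken from a real dump):

```shell
# Build a tiny stand-in dumpfile (contents are made up for illustration).
cat > /tmp/sample-dump.sql <<'EOF'
CREATE DATABASE /*!32312 IF NOT EXISTS*/ `wp_blog_1`;
USE `wp_blog_1`;
CREATE TABLE wp_posts (ID bigint unsigned NOT NULL);
EOF

# GNU sed form; on macOS use `sed -i ''` as above.
sed -i -e '/^CREATE DATABASE /d' -e '/^USE /d' /tmp/sample-dump.sql

# Only the table definition should survive.
cat /tmp/sample-dump.sql
```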
- Get the dumpfile to your local machine, and import:

  ```shell
  $ mysql -u [username] -p -e "create database foo"
  $ mysql -u [username] -p foo < ~/path/to/local/dumpfile.sql
  ```
  (Or use whatever technique you prefer for MySQL imports.)
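Put together, the whole round trip is short enough to script. The following is a sketch, not code from my actual setup: `DB_USER`, `DUMPFILE`, and `LOCAL_DB` are placeholders, and it prints each command (dry run) instead of executing, so you can review and paste the ones you want:

```shell
#!/usr/bin/env bash
# Dry-run sketch of the full workflow. DB_USER, DUMPFILE, and LOCAL_DB are
# placeholders; the script echoes each command instead of running it.
set -u

DB_USER="someuser"
DUMPFILE="/tmp/dumpfile.sql"
LOCAL_DB="wp_local"

# Step 1: dump every non-system database on the remote server into one file.
step1="mysql -u $DB_USER -p -B -N -e \"SELECT SCHEMA_NAME FROM information_schema.SCHEMATA WHERE SCHEMA_NAME NOT IN ('mysql','tmp','innodb','information_schema','performance_schema')\" | xargs mysqldump -u $DB_USER -p --add-drop-table --skip-lock-tables --quick --extended-insert --result-file=$DUMPFILE --databases"

# Step 2: strip the CREATE DATABASE / USE statements (GNU sed form shown;
# use `sed -i ''` on macOS).
step2="sed -i -e '/^CREATE DATABASE /d' -e '/^USE /d' $DUMPFILE"

# Step 3: create a single local database and import everything into it.
step3="mysql -u $DB_USER -p -e 'CREATE DATABASE $LOCAL_DB'"
step4="mysql -u $DB_USER -p $LOCAL_DB < $DUMPFILE"

for cmd in "$step1" "$step2" "$step3" "$step4"; do
  echo "+ $cmd"
done
```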