Tag Archives: mysqldump

Manually copy content and settings between sites in a WP network

I just had a request to copy the contents and settings from one site within a WordPress network to another within the same network. (The destination site is the “staging” version of the source.) Daniel Bachhuber’s Dictator and the general wp-cli export/import tools are ideal for this sort of thing, but due to some odd circumstances I wasn’t able to use them. So here’s a quick rundown of what I ended up doing. (This post is mainly for my own records. If any step below is confusing to you, you probably should not be doing it this way. Use at your own risk!)

  • Get exports of the production db tables (as well as staging, for backup). I ended up crafting the following (614 is the ID of the production site):
    [code language="bash"]mysql -u [username] -p information_schema -B -N -e "SELECT table_name FROM tables WHERE table_name LIKE 'wp_614_%'" | xargs mysqldump -u [username] -p [database name] --add-drop-table --skip-lock-tables --quick --extended-insert --result-file=[/path/to/dumpfile.sql][/code]
  • I downloaded that dumpfile and imported it into a local database, so that I could run it through https://github.com/interconnectit/Search-Replace-DB to do the necessary URL replacements. (Could’ve used wp-cli, but this way I didn’t need to have a functional local WP installation.) The local steps in this and the next two bullets are sketched after the list.
  • Did a further search and replace to change instances of ‘wp_614_’ to ‘wp_860_’ (the staging site ID)
  • Uploaded that .sql file to the server and imported it
  • Next, I had to handle files. Normally this would take 30 seconds at the command line, but permissions were locked down on this server: my SSH user couldn’t modify some of the directories in blogs.dir. So I wrote a quick script that runs the necessary commands in PHP (as the webserver user), implemented as an mu-plugin: https://gist.github.com/boonebgorges/75e3ec70bd5177dab7dd
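
For reference, the local round-trip described in those bullets looked something like the sketch below. The wp_scratch database name and the file paths are placeholders, and the URL replacement itself was done through the Search-Replace-DB tool rather than on the command line:

[code language="bash"]
# Import the production dump into a scratch local database
$ mysql -u [username] -p -e "create database wp_scratch"
$ mysql -u [username] -p wp_scratch < ~/path/to/dumpfile.sql

# ...run Search-Replace-DB against wp_scratch to swap the production URLs for staging URLs...

# Export again, then swap the table prefix so the tables belong to the staging site
# (-i '' is BSD/macOS sed syntax; with GNU sed, drop the '')
$ mysqldump -u [username] -p --add-drop-table wp_scratch > ~/path/to/staging.sql
$ sed -i '' -e 's/wp_614_/wp_860_/g' ~/path/to/staging.sql

# Upload staging.sql to the server and import it into the network's database
$ mysql -u [username] -p [database name] < staging.sql
[/code]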

Again, use at your own risk.

Convert multi-db WordPress mysqldump to single-db

On a number of client sites, I use HyperDB or SharDB to spread a WordPress Multisite installation across multiple databases on a single server. However, in my local dev environments, it’s annoying to have thousands of databases. So I use the following technique to create a copy of the remote site that operates in a single database locally.

  1. Use mysqldump to get a backup file. The following command ensures that you don’t pull in information_schema or any other unrelated databases; you can add other DBs to ignore to the NOT IN list:
    [code language="bash"]
    $ mysql -u [username] -p -B -N -e "SELECT SCHEMA_NAME FROM information_schema.SCHEMATA WHERE SCHEMA_NAME NOT IN ('mysql','tmp','innodb','information_schema','performance_schema')" | xargs mysqldump -u [username] -p --add-drop-table --skip-lock-tables --quick --extended-insert --result-file=[path/to/your/dumpfile.sql] --databases
    [/code]
  2. Use sed to remove all the ‘CREATE DATABASE’ and ‘USE’ lines in the dumpfile. This prevents the multiple databases from being created when importing locally.
    [code language="bash"]
    $ sed -i '' -e '/^CREATE DATABASE /d' /path/to/dumpfile.sql
    $ sed -i '' -e '/^USE /d' /path/to/dumpfile.sql
    [/code]
  3. Get the dumpfile to your local machine, and import:
    [code language="bash"]
    $ mysql -u [username] -p -e "create database foo"
    $ mysql -u [username] -p foo < ~/path/to/local/dumpfile.sql
    [/code]
    (Or whatever technique you use for mysql imports.)
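
If you want a quick sanity check after the import, something like the following (using the example foo database from above) will list the tables and confirm they all landed in the single local database:

[code language="bash"]
$ mysql -u [username] -p foo -e "SHOW TABLES"
$ mysql -u [username] -p -B -N -e "SELECT COUNT(*) FROM information_schema.tables WHERE table_schema = 'foo'"
[/code]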