Five years of BuddyPress

I started working with BuddyPress by accident. In February 2009, I responded to a tweet from my friend Matt Gold asking for help with a CSS issue on a site he was working on. That site was the still-in-beta CUNY Academic Commons, running on the still-in-beta BuddyPress. Within a few weeks, I was doing paid work for Matt’s project, working with BP (and WP, and web software in general) for the first time. And BuddyPress 1.0 came out just a few weeks after that.

Over the last five years, BuddyPress has taken over my professional life. I began by writing BP plugins. I started to contribute to BP itself through support and patches. I became a member, and eventually a lead, on the core team. My consulting work involves BuddyPress almost exclusively; this success (in terms of both money and impact) emboldened me to drop out of graduate school. People know me as “the BuddyPress guy”. When you type “boone gorges” into Google, it suggests “boone gorges buddypress”.

I feel very grateful to have stumbled into the project when I did. It aligns with many of my philosophical and political positions: the primacy of people over content, the importance of data ownership and free software, the fight against parasitic software vendors in public institutions. I’ve met some good friends through my association with BP. I’ve leveraged my expertise into a fun and comfortable career.

But the fact remains that it’s all been a fluke. When I realized it’d been five whole years, I couldn’t shake the thought: WTF. How strange to devote such a large part of one’s life to something that was such an accident. [Something something destiny something something forks in the road something.] I got lucky because I happened to stumble into something that was a particularly good fit for me. But I also took many leaps of faith along the way: agreeing to work on the CUNY Academic Commons when I had pretty much no idea what I was doing, submitting my first patches to BP, quitting my job, upping my rates, donating huge amounts of time to the free project instead of doing paid client work. I’m glad I had the guts to make each of these leaps.

Happy birthday to BuddyPress, and happy anniversary to me. Here’s to many more happy accidents!

Manually copy content and settings between sites in a WP network

I just had a request to copy the contents and settings from one site within a WordPress network to another within the same network. (The destination site is the “staging” version of the source.) Daniel Bachhuber’s Dictator, along with the general wp-cli export/import tools, would be ideal for this sort of thing, but due to some odd circumstances I wasn’t able to use them. So here’s a quick rundown of what I ended up doing. (This post is mainly for my own records. If any step below is confusing to you, you probably should not be doing it this way. Use at your own risk!)

  • Get exports of the production db tables (as well as staging, for backup). I ended up crafting the following (614 is the ID of the production site):
    mysql -u [username] -p information_schema -B -N -e "SELECT table_name FROM tables WHERE table_name LIKE 'wp_614_%'" | xargs mysqldump -u [username] -p [database name] --add-drop-table --skip-lock-tables --quick --extended-insert --result-file=[/path/to/dumpfile.sql]
  • I downloaded that dumpfile and imported it into a local database, so that I could run it through https://github.com/interconnectit/Search-Replace-DB to do the necessary URL replacements. (Could’ve used wp-cli, but this way I didn’t need to have a functional local WP installation.)
  • Did a further search and replace to change instances of ‘wp_614_’ to ‘wp_860_’ (the staging site ID); this step and the next are sketched after this list
  • Uploaded that .sql file and imported
  • Next, I had to handle files. Normally this would take 30 seconds at the command line, but this server was locked down: my SSH user couldn’t modify some of the directories in blogs.dir. So I wrote a quick script that runs the necessary commands in PHP (as the webserver user), implemented as an mu-plugin (sketched below): https://gist.github.com/boonebgorges/75e3ec70bd5177dab7dd
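For the record, the prefix swap and the re-import are quick one-liners. Something like the following (note that ‘wp_614_’ and ‘wp_860_’ are the same length, so even serialized values survive a blind replacement):

    sed -i '' -e 's/wp_614_/wp_860_/g' /path/to/dumpfile.sql
    mysql -u [username] -p [database name] < /path/to/dumpfile.sql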
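As for the file-copy mu-plugin, the actual script is in the gist linked above; the general shape of the trick looks something like this (names are hypothetical, and it assumes exec() is available to PHP):

    <?php
    // One-off mu-plugin: because it runs as the webserver user, it can write to
    // blogs.dir directories that my SSH user couldn't. Delete it when you're done.
    function bbg_copy_site_files() {
        // Only run when a super admin visits wp-admin with ?bbg_copy_files=1.
        if ( ! is_super_admin() || ! isset( $_GET['bbg_copy_files'] ) ) {
            return;
        }

        $src  = WP_CONTENT_DIR . '/blogs.dir/614/files';
        $dest = WP_CONTENT_DIR . '/blogs.dir/860/files';

        // Assumes the destination directory already exists.
        exec( 'cp -a ' . escapeshellarg( $src ) . '/. ' . escapeshellarg( $dest ) );
    }
    add_action( 'admin_init', 'bbg_copy_site_files' );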

Again, use at your own risk.

Dumb

Last year I wrote about my decision to remove email apps from my mobile devices. Today I took the next logical step and got rid of my smartphone altogether.

I was giddy when I got an iPhone in 2008. Having email and the web (and later, stuff like Twitter) on a mobile device was the coolest thing ever. But over the last year it’s become clear that, for me personally, the benefits of this connectivity are decidedly outweighed by the drawbacks. The smartphone keeps me connected to the internet; I work on the internet; therefore the smartphone keeps me connected to work. And when I’m not at my computer, dwelling on work-related issues is both pointless (because I can’t fix them until I’m at a computer) and annoying (because duh). Even if there were a way for me to carve out a totally-non-work-related part of my online life, I’m not sure I’d want it in my pocket, where I’m always tempted to fiddle with it.

To make the transition a bit more fun, I got myself a legitimately nice dumbphone, the Nokia 515 (which I had to order from a shady-seeming importer, because it’s not supposed to be available in the US). I’m having a good time setting it up. It’s been a few years since I had to migrate my contacts manually, so I’ve built up lots of cruft. The only people I moved over to the new device are those I really like (and might want to call) and those I really don’t like (and want to screen). The camera on the Nokia is pretty good for a dumbphone, but totally lame compared to my Moto X. Using multi-tap to type is hilariously awful, but T9 is better than I remembered. It’s retro-fun.

Using this phone is going to introduce friction into my routine. Messages will be harder to type; appointments will be trickier to look up; addresses will be impossible to locate; and so on. But when I look around a subway car or a restaurant or a playground and see dozens of people gazing vacantly into the easy gleam of their smartphone screens, I remember that friction can be good sometimes.

Any major dude with half a heart surely will be at WordCamp Connecticut on May 10

A few months ago, I had the pleasure of speaking at the WordPress Stamford Meetup, organized by Clint Warren. I musta put a bug in his ear or something, because I got a follow-up email last month letting me know he was organizing the very first WordCamp Connecticut. I’ll be giving a talk about BuddyPress.

The organizers are still looking for speakers, so if you’re a WordPress person in the CT vicinity (Stamford is an easy Metro-North ride from NYC), please consider applying to present! And if you’re just looking to nerd out for a day, add yourself to the mailing list so you’ll know when tickets are available. DO IT

Garmin Forerunner 310XT hacks

I’ve been running with a Garmin Forerunner 310XT for about eight months now. I like it pretty well (running with an HR monitor has totally changed my running for the better, but that’s a subject for another post), but there are a couple really annoying things about it, which I’ve been forced to hack workarounds for.

  • For me, the plastic that houses the transponder on the chest strap caused pretty severe chafing. I think this is something that Garmin is aware of; my wife has a previous version from the same series (the 305, I think), and the strap design does seem improved. But for me, the first month or two was pretty terrible. The chafing was awful, and running four days a week, I never had a chance to heal. I tried all different kinds of lube, tried different ways of positioning the monitor (around the center of my chest vs just under my armpits), played with different levels of tightness. What ultimately ended up working for me was this. I wear it around the narrowest part of my chest, with the strap fairly loose. When I’m running more than five or six miles, I use a bit of runner’s glide. And – this has made the biggest difference for me – I wrapped the big hunk of plastic in a couple layers of athletic tape. It still irritates me a bit, but there’s no more bleeding.

    [Photo: the transponder housing, wrapped in athletic tape]

  • The watch has this cool feature where you put a little USB nub in your computer, and it’s supposed to auto-download your latest activity as soon as the watch comes into range. This has worked for me maybe five times, tops. Typically, the software doesn’t recognize the watch at all, and for the first few weeks I owned it, I struggled to find a workflow that’d let me store my workouts on my computer. The only way I could make it work consistently is by re-pairing the watch + computer every time I want to download. Here’s what I do when I get back from a run (I use a Mac for this):
    1. Close the Garmin ANT Agent program in the toolbar
    2. Delete the local Garmin data folder: rm -rf ~/Library/Application\ Support/Garmin
    3. Start the Garmin ANT Agent application
    4. From the Garmin toolbar menu, choose “Pair with New Devices”. Within a few minutes, it’ll start re-syncing

    Aside from the general fact that it’s Extremely Stupid, the annoying thing about this process is that it takes progressively longer to complete the more workouts you have on your watch (because you’re deleting your local cache, it’s got to download all of them each time). So, every few weeks, I delete all activities from the watch. But before doing so – because I don’t trust Garmin’s “Garmin Connect” online service – I make sure to copy the .tcx files from my local directory to some safe location, so that I have offline access to my running history if I want it:
      cp ~/Library/Application\ Support/Garmin/Devices/xxxxxxxxxxxx/History/* /some/other/location
    (where “xxxxxxxxxxxx” is your device ID)
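    If you do this often, the archiving step is easy to script. A minimal sketch (assuming, as above, that “xxxxxxxxxxxx” stands in for your device ID, and that ~/garmin-archive is wherever you want the copies to live):

      #!/bin/bash
      # Copy the Garmin history files into a dated archive folder.
      SRC="$HOME/Library/Application Support/Garmin/Devices/xxxxxxxxxxxx/History"
      DEST="$HOME/garmin-archive/$(date +%Y-%m-%d)"
      mkdir -p "$DEST"
      cp "$SRC"/* "$DEST"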

Recommendations for per-project time tracking tools

I don’t bill by the hour very much anymore, but I still like to keep rough track of time spent on individual client projects, for my own purposes. I currently use a simple spreadsheet, with tabs for each project/client. Yesterday I asked on Twitter what tools people were using for this purpose.

Here are some responses I got. I can’t personally endorse anything on this list, but it might be a helpful starting point for others.

Recent Anthologize updates

Anthologize, you are neglected, but not forgotten!

In the past week or so, I’ve done two maintenance releases (0.7.2 and 0.7.3) for Anthologize. A few highlights:

  • Fixed some issues with the way TCPDF saves image files in a temporary cache. This should help to avoid the dreaded “TCPDF ERROR: Can’t open image file” fatal error when exporting to PDF on some server configurations.
  • Fixed some issues with the way that Anthologize’s JS and CSS files are loaded, for better compatibility with other plugins and with SSL wp-admin.
  • Fixed a bug that gave non-admins the ability to change settings on some multisite configurations.

Speaking of not forgotten, I haven’t forgotten my friends who supported my Anthologize campaign back in 2012. This post goes out to Eric A Mann, an outstanding WordPress developer and blogger. Thanks for supporting Anthologize, Eric!

Default Gravatar images and SSL

I have a client who runs a number of WordPress/BuddyPress sites over SSL. He noticed in the last few days that default Gravatar images – the images that Gravatar serves when there is no Gravatar associated with the queried email address – were not being served. The browser showed broken images, and when you attempted to load the associated https://secure.gravatar.com URL in a separate tab, you’d see the message “We cannot complete this request, remote data could not be fetched”.

After a bit of futzing around, I found this recent post by Eric Mann describing a similar issue with the Photon CDN feature in the Jetpack plugin. He managed to figure out that Automattic’s CDN service wasn’t fetching items that were served over HTTPS. (The fact that it ever worked was, apparently, a bug; that “bug” was recently fixed.)

It turns out that the same thing is true for Gravatar’s “Default Image” feature (unsurprising, as I assume it uses the same CDN as Photon). Gravatar lets you specify a local file that will be served if no actual Gravatar is found:

    <img src="http://www.gravatar.com/avatar/00000000000000000000000000000000?d=http%3A%2F%2Fexample.com%2Fimages%2Favatar.jpg" />

But, as of the last few weeks, if the value of the d= param is served over HTTPS only, Gravatar throws an error.

There are a couple strategies for working around the problem.

  • Use Gravatar’s defaults instead – Gravatar hosts a number of default images that you can use instead of a local image. This is especially pertinent in the case of BuddyPress, whose default behavior is to construct Gravatar requests like this: http://www.gravatar.com/avatar/00000000000000000000000000000000?d=http%3A%2F%2Fexample.com%2Fwp-content%2Fplugins%2Fbuddypress%2Fbp-core%2Fimages%2Fmystery-man.jpg. But the mystery-man.jpg that ships with BuddyPress is the same image you get with ?d=mm. So an easy way around the problem of Gravatar reading from your SSL-protected site is to prevent Gravatar from making any requests to your site at all. In BuddyPress, use the following:
    function bbg_use_gravatar_mm() {
        return 'mm';
    }
    add_filter( 'bp_core_mysteryman_src', 'bbg_use_gravatar_mm' );
    
  • Allow non-SSL access to your default – As suggested in Eric’s post, you can tell your webserver that some of your content can be served over HTTP rather than HTTPS. For example, on one of the sites I’m working on, we force HTTPS for all requests using an .htaccess rule. I can amend it to allow an exception for the custom Gravatar default:
    RewriteCond %{HTTPS} off
    RewriteCond %{REQUEST_URI} !^/wp\-content/themes/yourtheme/images/default\-gravatar.jpg$
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R,L]
    

    Then, force BuddyPress to tell Gravatar you want the non-SSL version of the fallback:

    function bbg_custom_default_avatar() {
        return set_url_scheme( get_stylesheet_directory_uri() . '/images/default-gravatar.jpg', 'http' );
    }
    add_filter( 'bp_core_mysteryman_src', 'bbg_custom_default_avatar' );
    

Even if you’re not using BuddyPress or WordPress, the same strategy applies: if you’re serving your whole site over HTTPS, tell Gravatar to use either one of its own images or one of your non-SSL-available images as its default.
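In concrete terms: make sure the d= parameter points either at one of Gravatar’s named defaults or at a URL its servers can fetch over plain HTTP. Both of these illustrative requests are safe:

    <img src="https://secure.gravatar.com/avatar/00000000000000000000000000000000?d=mm" />
    <img src="https://secure.gravatar.com/avatar/00000000000000000000000000000000?d=http%3A%2F%2Fexample.com%2Fimages%2Fdefault-gravatar.jpg" />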

Convert multi-db WordPress mysqldump to single-db

On a number of client sites, I use HyperDB or SharDB to spread a WordPress Multisite installation across multiple databases on a single server. However, in my local dev environments, it’s annoying to have thousands of databases. So I use the following technique to create a copy of the remote site that operates in a single database locally.

  1. Use mysqldump to get a backup file. The following command ensures that you don’t pull in information_schema or any other unrelated databases; you can add other DBs to ignore to the NOT IN list:
    $ mysql -u [username] -p -B -N -e "SELECT SCHEMA_NAME FROM information_schema.SCHEMATA WHERE SCHEMA_NAME NOT IN ('mysql','tmp','innodb','information_schema','performance_schema')" | xargs mysqldump -u [username] -p --add-drop-table --skip-lock-tables --quick --extended-insert --result-file=[path/to/your/dumpfile.sql] --databases
    
  2. Use sed to remove all the ‘CREATE DATABASE’ and ‘USE’ lines in the dumpfile. This prevents the multiple databases from being created when importing locally.
    $ sed -i '' -e'/^CREATE DATABASE /d' /path/to/dumpfile.sql
    $ sed -i '' -e'/^USE /d' /path/to/dumpfile.sql
    
  3. Get the dumpfile to your local machine, and import:
    $ mysql -u [username] -p -e "create database foo"
    $ mysql -u [username] -p foo < ~/path/to/local/dumpfile.sql
    

    (Or use whatever technique you prefer for mysql imports.)

The pleasure of being a lone wolf

The WordPress consulting world has, of late, been all about consolidation and upsizing. Firms get larger by hiring independents and acquiring smaller businesses. Every week I read countless tweets and blog posts about the joys of working for a larger team: the camaraderie, the efficiencies of scale, the pleasure of getting a regular paycheck. And as a longtime solo freelancer, I’m very sensitive to the shortcomings of being a lone wolf.

At the same time, I try not to forget the beauty of going it alone. First and foremost is the flexibility. Aside from my personal expenses, I have no payroll and practically no overhead. I’m able to turn down work that seems unpleasant or doesn’t jibe with my philosophical predilections, even when the work would pay very well. If I have a good few months and feel like not taking on any more projects for a while, I can do so. I can spend as much time as I’d like working pro bono on free software. On the flip side, when someone approaches with a project that sounds fun but might not pay well, I can take it, guilt-free.

Flying solo also means that I’m constantly cycling in and out of new teams and projects, and so am constantly exposed to new workflows, new tools, new technologies, new ideas. Sometimes the rootlessness feels lonely, but it can also be exhilarating.

It’s not so bad being a loner.