Moving from Apache to Nginx – wow!

I’ve been meaning to switch this site from Apache to Nginx for a while now – really, to tune the whole site a bit better. I’ve been using various tools (Google PageSpeed, http://www.monitor.us, http://www.pingdom.com) to track the results, and I thought I’d recap how things have gone.

When I first started this project, I had a plain-vanilla WordPress install with no optimization, running a handful of plugins on an old Dell 1U Solaris server using Apache+mod_php. PageSpeed showed my initial score to be 75, and infrequent monitor.us performance monitoring put the average response time at about 650ms as a baseline.

My first step was to resize all the “system” images (NOT the gallery images). This bumped the PageSpeed score up to 77.

I tried installing W3 Total Cache, but the whole install was a mess. It wants to write to /wp-content, requests 777 permissions, etc. I had problems with the page display getting mangled, even after clearing the cache. Even with the pages messed up, the PageSpeed score went to 81. I’m sure I could have figured out how to get it running the way I wanted, but I wanted to do all of this by hand anyway, so I scrapped Total Cache and moved on – back to a PageSpeed score of 77.

Next up – enabling mod_gzip on Apache. PageSpeed score increased to 81 – not bad for a simple modification.
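
For reference, this is only a couple of lines in httpd.conf. Here’s a minimal sketch – I’m showing mod_deflate, the Apache 2.x equivalent of mod_gzip, and the module path is an assumption that depends on your build:

    # httpd.conf -- enable on-the-fly compression of text responses
    # (mod_deflate shown; module path varies by build)
    LoadModule deflate_module modules/mod_deflate.so
    AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript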

I was going to try installing mod_pagespeed from Google, but I had no luck getting it to compile with gcc under Solaris. I love Solaris, but I swear I need to switch to Linux one of these days – hardly any new software compiles under Solaris. My list of stuff I can’t get to compile is long: mongodb, mcrypt, mysql-5.5.x, newrelic – and those are just the relics sitting in my /usr/local/src directory that I haven’t deleted in frustration.

Next I installed some object caching via APC, but the PageSpeed score stayed at 81. Adding sendfile and MMAP via the Apache httpd.conf bumped it up to 83.
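
The sendfile and MMAP changes are just one-line toggles in httpd.conf. A quick sketch (defaults and availability vary by Apache version, so treat this as illustrative):

    # httpd.conf -- serve static files via the kernel's sendfile()
    # and read files through memory-mapped I/O
    EnableSendfile On
    EnableMMAP On
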
It was at this point that I started a trial of Pingdom to track page response times at a more granular level. However, Pingdom was showing page response times of around 975ms, which surprised me since monitor.us had shown a drop from 650ms to around 587ms. I wish now that I had a baseline number from Pingdom before starting any of this, but oh well.

Next up was the big change – moving from Apache to Nginx+php-fpm. After getting everything configured and tested, I flipped the switch. WOW! PageSpeed reports 94, and Pingdom shows the response time dropping from 975ms to 237ms (see graphic). Talk about immediate results!
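
For anyone curious what the Nginx side looks like, here’s a stripped-down sketch of a server block that hands PHP off to php-fpm. The server name, docroot, and socket path are placeholders, not my actual config:

    # nginx.conf -- minimal WordPress-style server block using php-fpm
    server {
        listen 80;
        server_name example.com;          # placeholder
        root /var/www/wordpress;          # placeholder docroot
        index index.php;

        location / {
            # serve the file if it exists, otherwise let WordPress route it
            try_files $uri $uri/ /index.php?$args;
        }

        location ~ \.php$ {
            include fastcgi_params;
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
            fastcgi_pass unix:/var/run/php-fpm.sock;   # or 127.0.0.1:9000
        }
    }
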
There’s more tuning I plan on doing (like implementing a CDN via S3), though I think all the big gains have been achieved. I’ll update the blog as I get the results.

Pho Fuchsia in Pioneer Square

I’ve been on quite the “office shopping” trip lately, first trying out Office Nomads for a couple of months (details here), and now here at SURF Incubator. Part of the excitement of a new place is trying out all the nearby restaurants. There are quite a few options so far, but I’ve struggled to find a good pho/banh-mi joint close by (I need to explore the International District much more).

But I think I’ve found a very good stand-in for my pho cravings: Pho Fuchsia. I was pleasantly surprised by a solid chicken banh mi – not as good as Pho Mimi on Aurora, but better than anything else I’ve tried down here. And the pho tai (rare beef) was delicious: the broth was dark and well flavored, the beef was good quality, and the accoutrements were acceptable.

Service was solid, although the server did warn me that it gets massively busy during the lunch rush (I’m not surprised).

At any rate, this will probably become a once- or twice-a-week spot for me – I definitely recommend it.

How to process your RSS feeds quickly and easily

This blog post has been a long time coming, but better late than never!

I’ve talked to a lot of people who say they can’t keep up with the flood of information these days, so I wanted to share my system for processing RSS feeds. I’ve got about 600 (!) active feeds that update at various intervals, and keeping up with them is certainly a chore.

I use my iPad 3 with Feeddler Pro to quickly scan feeds. I’ve tried almost every RSS reader on the iPad, and Feeddler Pro’s interface is the best for skimming and scanning. It also integrates with Instapaper (another one of my favorites) for saving long-form articles in the cloud that I want to read later (on my iPad, my iPhone, my laptop, or my desktop).

However, sometimes there are articles that I specifically want to open and read on my desktop (or laptop), and saving them to Instapaper doesn’t quite work for that. Luckily, Feeddler Pro makes it trivially easy to “star” (mark as a favorite) an article, so you can zip through your feeds, starring anything that looks interesting.

My problem was that I never remembered to go back and dig those articles out of the Starred Articles section.

So I wrote a script (https://github.com/heybige/greader) that logs in to my Google Reader account, grabs (and deletes) all the starred URLs, and emails them to me via a cron job every day at 11:59pm:

    59 23 * * * ( /usr/local/bin/daily.php )
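
If you’re wondering what the script does, here’s a heavily simplified sketch of the moving parts – the real script is at the GitHub link above, and the Reader endpoint, auth header, and email address below are assumptions for illustration, not copied from it:

    #!/usr/bin/php
    <?php
    // Sketch only -- the real script is at https://github.com/heybige/greader.
    // Assumes you've already fetched a ClientLogin token; the un-starring
    // step (Reader's edit API) is omitted here.
    $ctx = stream_context_create(array('http' => array(
        'header' => "Authorization: GoogleLogin auth=YOUR_TOKEN\r\n", // assumed token
    )));
    $atom = file_get_contents(
        'https://www.google.com/reader/atom/user/-/state/com.google/starred',
        false, $ctx);

    // Pull the article links out of the Atom feed.
    $links = array();
    foreach (simplexml_load_string($atom)->entry as $entry) {
        foreach ($entry->link as $link) {
            if ((string) $link['rel'] === 'alternate') {
                $links[] = (string) $link['href'];
            }
        }
    }

    // Email the list to myself (address is a placeholder).
    mail('me@example.com', 'Starred articles ' . date('Y-m-d'), implode("\n", $links));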

For those of you technical enough to understand crons and PHP, enjoy!