Content delivery networks and DNS

“Lisa needs braces!”
“Dental plan!”
“Lisa needs braces!”
“Dental plan!”
“Lisa needs braces!”
“Dental plan!”

I just had one of those moments when all the clues have been laid out before you, but it still takes you forever to put 2 and 2 together and get anything other than 2 and 2.

For the last six months, I’ve been railing against my crappy home Internet connection. That’s no surprise to anyone who lives in this country, but I just couldn’t believe that in 2010 I still couldn’t watch a single YouTube video without having to pause it and come back ten minutes later, once it had buffered.

It turns out I had been shooting myself in the foot, or rather, they had been shooting me in the foot.

A while back, Google announced their DNS service. It’s fantastic, because you finally don’t have to look up your ISP’s DNS server addresses every time you set up a computer on the net; you can just use Google’s easy-to-remember addresses, 8.8.8.8 and 8.8.4.4. I started using it right away, because I was also moving between ISPs (from Xnet to Orcon) and it meant things would change over smoothly. The only cost, I thought, would be that DNS lookups would have slightly higher latency than if I used my local ISP’s servers.

However, it turns out there’s a bigger catch: content delivery networks use the source of your DNS requests to work out where in the world you are, and direct you to their closest server. If you use a non-local DNS server, YouTube, Akamai, and a whole bunch of others will send your downloads all the way around the world to a far-off server, instead of to a much closer (and usually faster) one. This is not what DNS was designed for, and it’s an ugly hack, but apparently it’s been going on for a while now; I just never worked it out.
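
You can see this for yourself by asking two different resolvers for the same CDN-hosted name and comparing the answers. Here’s a minimal sketch in Python using the third-party dnspython library (my choice of tool, nothing the CDNs require); the hostname is just an example, and the second address is a placeholder for your ISP’s resolver:

import dns.resolver  # third-party: pip install dnspython

def lookup(hostname, nameserver):
    # Build a resolver that ignores /etc/resolv.conf and asks
    # only the nameserver we hand it.
    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = [nameserver]
    answer = resolver.resolve(hostname, "A")
    return sorted(record.address for record in answer)

host = "www.youtube.com"  # illustrative CDN-hosted name
print("Google DNS:", lookup(host, "8.8.8.8"))
print("ISP DNS:   ", lookup(host, "203.0.113.53"))  # placeholder: your ISP's resolver

If the two lists of addresses differ, the CDN is choosing servers based on where it thinks your resolver is, not where you are.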

This is stupid and annoying, and I can say that because one of the guys who helped create DNS thinks the same. The moral of the story, though, is that you can be better off using your local ISP’s DNS servers instead of ones such as OpenDNS or Google Public DNS. At least under these circumstances.

Welcome to my cloud

I have been playing with Modern Technology. As a freebie for attending Kiwi PyCon in Waitangi in November, I was given a $50 credit for Amazon Web Services. The timing was perfect, because I’ve been needing to play with it for ages anyway. About six months ago I had a crack at setting up a pilot private cloud on some old hardware using Ubuntu Enterprise Cloud, with pretty much zero success. The learning curve is steep enough just using cloud virtualisation, let alone having to administer it yourself without prior experience.

So, using Amazon, I can get a feel for running virtual machines through the EC2 interface before having to learn how to administer the host machines myself. Which is handy.
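
To give you an idea of how low the barrier is, here’s a minimal sketch of launching the cheapest instance type with the boto library (my choice of client; the AMI ID and key pair name are placeholders, and credentials come from the environment):

import boto  # third-party: pip install boto

# Reads AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY from the environment.
conn = boto.connect_ec2()

# The AMI ID and key pair name below are placeholders; t1.micro
# was the cheapest instance type on offer.
reservation = conn.run_instances(
    "ami-xxxxxxxx",
    instance_type="t1.micro",
    key_name="my-keypair",
)
print("Launched:", reservation.instances[0].id)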

This also means I can finally decommission the poor little old Pentium 3 that has been sitting in my basement, humming forlornly to itself, for almost nine years.  It has served me very well, and I think it deserves the right to finally die in peace.

So what I have now is eastman.net.nz, a virtual machine running on Amazon’s EC2 network in the cheapest available configuration. I’ve also set up Google Apps to handle the email side of things, so I no longer need to run my own mail server in my basement, limited by my crappy New Zealand DSL connection.

I’m also trying to put all the clever system-administration tricks I’ve learned over the last 10 years or so into practice here.  The idea will be to automate as much as I possibly can, so that the system will basically administer itself. I’m using a combination of Fabric and Puppet to automatically provision and configure the server.  Basically so that I never have to log in personally and issue a single management command. In practice, I’ll have to cheat, but not very often. I’ll write more about that setup in another post sometime. Honestly it’s pretty cool, if you’re in to that sort of thing.
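
As a taste of what that looks like, here’s a minimal sketch of a Fabric task that bootstraps a fresh server and hands it over to Puppet (the hostname and manifest path are placeholders, not my actual setup):

from fabric.api import env, put, sudo

env.hosts = ["eastman.net.nz"]  # placeholder target host

def provision():
    """Bootstrap a fresh server so Puppet can take over."""
    sudo("apt-get update && apt-get -y install puppet")
    # Ship the manifest across and let Puppet converge the box;
    # 'puppet/site.pp' is a placeholder path.
    put("puppet/site.pp", "/tmp/site.pp")
    sudo("puppet apply /tmp/site.pp")

Run it with “fab provision” and, in theory, you never touch the box by hand again.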

If you run Ubuntu, and unpack the Firefox source code, your computer will die

Updated: I received a notification from Launchpad that the bug has been fixed in an upstream version of w3m.

Every once in a while, for no apparent reason, my work machine (a quad-core 27″ iMac running Ubuntu 10.04) has been slowing down to an absolute crawl, falling over, and dying. It’s been infuriating, and I’ve had no luck working out what might be causing it. Until now.

As it started slowing down and thrashing, I managed to run ps, only to find this process sitting there chewing up almost all the memory and half the swap on my 4-gigabyte machine:

tom      14334  1.7 53.3 3096296 2159932 ?     DN   \
16:49   0:06 w3m -o indent_incr=0 -o multicol=false \
-o no_cache=true -o use_cookie=false -o display_charset=utf8 \
-o system_charset=utf8 -o follow_locale=false \
-o use_language_tag=true -o ucs_conv=true -T text/html \
-dump /home/tom/firefox-src-jssh/firefox-3.6.6+nobinonly/\
build-tree/mozilla/layout/html/tests/table/bugs/bug141818.html

For some reason, a text-based web browser (w3m) is sitting there going quietly insane while trying to make a dump of an HTML file in my home directory. That HTML file is a regression test sitting in the Firefox build tree. That means it’s a bad file: it’s supposed to make web browsers crash.

The parent process for w3m is “evolution-data-server-2.28”.

I have questions.  For brevity, I shall abbreviate “What the FUCK” as WTF and “Why the HELL” as WTH.

  • WTF is evolution-data-server anyway? Why do I have it? Why do I need to run it? I don’t use Evolution.
  • WTH are you scanning my home directory? What are you looking for?
  • WTH is w3m trying to parse random HTML files buried in Firefox source trees in my directory?
  • If there is some valid reason for all this, then WTH isn’t evolution-data-server putting some kind of resource limit on w3m, to stop it from taking the whole damn computer down with it? (See the sketch after this list; it’s a few lines of code.)
  • WTF, man? Just, WTF?
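
For the record, putting a resource limit on a child process is not hard. Here’s a minimal sketch, assuming Linux and plain Python, of how a parent like evolution-data-server could run w3m under a hard memory cap (the 512 MB limit and the file name are placeholders of mine):

import resource
import subprocess

def limit_memory():
    # Runs in the child just before exec: cap the address space
    # at 512 MB (an arbitrary placeholder). When w3m hits the cap
    # its allocations fail and it dies alone, instead of dragging
    # the whole machine into swap.
    cap = 512 * 1024 * 1024
    resource.setrlimit(resource.RLIMIT_AS, (cap, cap))

subprocess.run(
    ["w3m", "-T", "text/html", "-dump", "bug141818.html"],
    preexec_fn=limit_memory,
)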

At this point, I’m actually hoping that it’s all much simpler: that somehow my work machine has been compromised, and w3m is actually running a rootkit of some kind.  At least that way it’s just me who’s been monumentally stupid.

Otherwise, the moral of the story is simple: if you’re running Ubuntu and you unpack the Firefox source code for some reason, your computer will die.