The Digital Economy Bill – A Cryptographer’s View

I love cryptography. If you’ve ever received email from me you know I sign all my email messages with OpenPGP; many of you share keys with me and we exchange all our emails in encrypted form.

Yesterday, I went and bought an Ipredator subscription for 15 euros. Ipredator is a service that provides a PPTP Virtual Private Network endpoint in Sweden, anonymized and encrypted from your computer to the exit node.

Yesterday, the Digital Economy Bill was passed into law.

These two events were not unrelated. Why did I set up all of my internet traffic to be encrypted and exported from the UK before it gets released onto the internet? Because perfectly legal things that anyone on the internet does on a daily basis are being criminalized. Websites like YouTube and Google can now be blocked on the basis of rights holders (the BPI and pals) accusing a site of being likely to be used for copyright infringement. Awesome.

Not only that, but any connection in the UK which is accused of having copyright infringement associated with it (note there is no requirement for evidence, and this applies to a whole connection, not to individuals) must be disconnected by its ISP.

So, I’m turning to cryptography to cover my ass. If everything coming in and out of my connection is encrypted until it hits somewhere in Sweden, at which point it has no traceable relation to me individually, then any accusation of filesharing _must_ be unfounded, because there’s no way they could know (and for the record, I download the occasional TV show and Linux distribution using BitTorrent).

And I absolutely think that a little dash of cryptoanarchism in the UK would be a fantastic thing right now. Spread the word on cryptography and VPNs, get Tor and I2P into more mainstream use with your friends and family, you name it. Let’s face it: nearly everyone who can use BitTorrent or anything else to infringe copyright can use cryptography to hide that without much effort or expense, and there’s no reason why people who don’t infringe copyright shouldn’t use it too. We all use cryptography every day of our lives for online banking, shopping, or just visiting sites which default to SSL (I use GitHub, for example, which bundles SSL with its paid accounts). The government can’t regulate what it can’t see, and it can’t legislate cryptography out of existence. It’s a solution, though a tricky one.

I’ve already started thinking: what about cheap, mainstream-friendly VPN appliances? It looks like I’m not the only person to think of this; there’s a fair bit of discussion about it already. Imagine a £50-75ish bit of kit: you buy it, you get the hardware plus some bundled months of VPN access, which you can top up whenever you want. If it were polished and easy to set up (plug it into the wall, run network cables between your existing router/modem and your computer(s), turn it on, make an account, done), it’d have a chance of becoming a way to deliver cryptography not just against government and ISP snoopers, but also to provide security for people in places like university halls of residence, shared homes, and so on. Heck, I use Ipredator on my iPod to encrypt anything going over wifi while I’m out and about on public and campus wifi. I’m not particularly worried about being snooped on, but why not? (A bit of discussion came to the conclusion that a custom firmware for a WRT54GL would be the way to go.)

The point is that those who do infringe copyright will always be one step ahead of the curve technologically. I absolutely predict VPN tunnels will be the next big thing for BitTorrent users and legitimate users alike. And what will the government do then? Cut off anyone using a VPN? There go business users working from home. Cut off anyone with lots of internet usage? (Not that ISPs aren’t trying to do that anyway. I’m looking at you, PlusNet!) You’d cut off half the UK, including anyone who used iPlayer. VPNs are the way to go, though the number of providers could do with increasing.

And there’s no chance the government can keep up, and why should it? At the end of the day, copyright needs to be reformed to take the internet into account. That is the only way the problem will ever be solved from a legal standpoint; trying to win this war with technology won’t work for the government. Deep Packet Inspection hardware works until you slather everything with crypto. Disconnection notices work until you realise that all the pirates are on dynamic IPs and the ISPs don’t keep track of who has which IP at any given moment (and if they do, why? Do they have a legal onus to do so?). And even if DPI or disconnection worked, you still haven’t fixed the problem, and you’re still causing huge inconvenience for every user of that connection.

I’d like to think that our ministers vaguely understand what all this means, and that at least the people in charge of it all know their technical stuff. Alas, that turns out not to be the case: The Rt Hon Stephen Timms MP, Minister for Digital Britain, revealed in this letter that he believes the term “IP address” means “Intellectual Property address”. It seems unlikely that he’s merely confused IP addresses with a URN scheme or anything like that; he really does seem to think that’s what IP stands for. If you don’t know, it actually stands for Internet Protocol, the underlying framework most of the internet relies on for communication between computers. It’s the sort of thing you cover in the first lesson of any networking course.

With this absolute failure to have knowledge where it’s needed in the current UK political system, it’s even more important that the Digital Economy Bill be repealed as soon as humanly possible (if that is possible; I’m not a lawyer), or at least heavily amended. In the meantime, we should be encouraging MPs to learn the basics, voting for those who will make the right decisions (looks like the LibDems for me), and spreading the word about all this as fast as we can. And a little cryptoanarchism wouldn’t hurt, either.

And on that note, I’m going to go see what it’d take to set up a free VPN endpoint on one of my underused VPSes.
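For the curious, here’s a minimal sketch of what that might look like on a Debian-ish VPS, using the same PPTP protocol Ipredator offers. Package names, interface names, and addresses are assumptions for illustration; check your distro’s documentation before copying any of this:

```shell
# Install the PPTP daemon (Debian/Ubuntu package name; may differ elsewhere)
apt-get install pptpd

# /etc/pptpd.conf: the server's VPN-side address and a pool for clients
echo "localip 10.0.0.1" >> /etc/pptpd.conf
echo "remoteip 10.0.0.100-150" >> /etc/pptpd.conf

# /etc/ppp/chap-secrets: one line per user (user, server, password, allowed IPs)
echo 'alice pptpd s3cret *' >> /etc/ppp/chap-secrets

# Let clients route out through the VPS (NAT on the public interface)
echo 1 > /proc/sys/net/ipv4/ip_forward
iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE

/etc/init.d/pptpd restart
```

That gets you a working endpoint, though anyone doing this seriously should note PPTP’s MS-CHAP authentication has known weaknesses; OpenVPN is the sturdier choice if your clients support it.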

http://www.openrightsgroup.org/campaigns/disconnection/why-care

Varnishing over varnish

Well, we’ve given up on Varnish. After desperately trying to make it play nicely with everything else on the system, we’ve removed it from our application stack entirely. Why? Memory architecture.

Part of the documentation on Varnish’s website is a long architectural explanation of why the OS, not Varnish, should decide what stays in RAM and what gets swapped to disk, and why Varnish therefore does no memory management of its own. There is a problem here, however: this design means Varnish basically assumes the OS will handle memory contention between itself and other programs.

This is not a smart move. First off, some OSes are terrible at that sort of thing (Linux is pretty good). But here’s the real issue: take a database server like PostgreSQL. PostgreSQL correctly lets the OS handle disk caching rather than duplicating the effort internally. This is a great move, because you don’t have to guess how much RAM to let PostgreSQL take up for disk caching; the OS handles it all. And since it’s just cache, that space can be reallocated to programs which need RAM, and later given back to PostgreSQL (or any other app).
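Concretely, the standard PostgreSQL advice is to keep its internal buffer pool modest and just tell the planner how much OS cache to expect. The numbers below are illustrative for a hypothetical 8 GB box, not from our setup:

```
# postgresql.conf -- illustrative values, assuming an 8 GB machine
shared_buffers = 512MB        # PostgreSQL's own cache stays small
effective_cache_size = 4GB    # hint to the planner: expected OS disk cache
```

`effective_cache_size` doesn’t allocate anything; it just tells the query planner how much of the dataset is likely to be sitting in the OS page cache, which is exactly the cooperative behaviour Varnish lacks.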

varnishd was regularly climbing to around 4-6 gigabytes of RAM usage, forcing even application memory into swap and leaving the OS nothing for disk caching, with a terrible knock-on impact on the performance of PostgreSQL on the same machine. I should point out that the 4-6 gigabyte figure was obtained while running varnishd with a 1 gigabyte disk cache.
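For context, a file-backed varnishd invocation looks roughly like this (addresses and paths are illustrative, not our exact command line). The storage file is mmap()ed, so pages the OS caches from it show up in the process’s apparent memory footprint on top of its working set, which is part of why the numbers balloon past the configured size:

```shell
# 1 GB file-backed cache; the mmap()ed storage file inflates apparent RSS
varnishd -a :6081 -b localhost:8080 \
         -s file,/var/lib/varnish/storage.bin,1G
```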

Basically, if you want to run Varnish (and there are many good reasons to; it’s a fantastic cache server aside from this issue), you need a dedicated machine for it. The architecture of the software makes it impossible for it to coexist on a server with other programs. We even tried having Monit restart it when it reached 1 gigabyte of RAM usage, but the restarts still had a terrible impact, and cache effectiveness suffered for it. A 45% cache hit rate on Varnish was a lovely thing, and helped reduce load on our backend servers, but Varnish was slowing those same servers down enough for that not to work out at all.
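If you want to try the Monit approach anyway, a rule along these lines will bounce varnishd when it grows past a memory limit. Paths and init script names here are assumptions; adjust for your system:

```
# /etc/monit/conf.d/varnish -- restart varnishd past 1 GB of RAM
check process varnishd with pidfile /var/run/varnishd.pid
  start program = "/etc/init.d/varnish start"
  stop program  = "/etc/init.d/varnish stop"
  if totalmem > 1024 MB for 2 cycles then restart
```

Be warned, though: every restart empties the cache, so you trade the memory pressure for a cold cache and a thundering herd at your backends, which is roughly what we saw.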

With the 1 gigabyte of RAM we freed by removing Varnish, we’ve added four more application servers to EVE Metrics. These are more than coping with demand, and we’re happily seeing things stay nice and stable even with a lot of API accesses. So far, then, so good.

On a side note, users of the popular accVIEW application will be happy to know I’m spending a chunk of this weekend improving the app and adding some much-needed features: persistent API key storage for users (so corporate security can be maintained even when people leave corporations or join new ones), a forgotten-password feature, and performance improvements.