Computer lifetimes and the executable web
Published: February 10, 2013
Tags: executable web, bloatware
Lately, I have been thinking about investing in a small SSD for my Thinkpad X60s laptop, which is my primary computer at home and when travelling. I got the inspiration from this article. While looking at SSDs on Amazon, I came across the following review of one:
It's amazing how we (me included) think an old computer is worthless. Fact is, add a little memory and an SSD and it will be useful for at least another year or more. The Kingston SSDNow V300 is a great value for doing just that. I have now bought 2 of these and they work great. Now my kid's Core 2 Duo laptop will last another year or 2.
Of course, I think this sentiment about old computers being inappropriately considered worthless is bang on the mark, but it struck me as curious that, even after making this very point, the reviewer goes on to say that his kid's laptop will last "another year or 2".
I think it's a shame that it's not widely considered ridiculous to talk about computers "lasting" a certain amount of time, or "getting old". This is especially true of computers which have an SSD instead of an old spinning magnetic hard drive. Once you've reached that stage, your computer should last forever, or close to it. CPUs, RAM and SSDs don't "get old". They don't physically wear down and get slower. Old computers aren't like old cars. Barring some sort of hardware failure, a computer should be exactly as fast on its 10th birthday as it was the first time you booted it up.
Of course, it's not surprising that people don't think this way. Computers really do feel like they get slower over time. But this is no fault of the computer, which is exactly as performant as it ever was. It's because, in general, software sucks. Some of this is due to poorly written operating systems which slow down over time due to the build-up of cruft in databases and the like. That kind of slowdown can be reversed with a good old format and reinstall (it should be considered a cause for great shame that in 2013, formatting and reinstalling Windows is widely accepted by technically experienced people as something you just have to do to keep things running fast and smooth). But another component, which cannot be so easily reversed, is the fact that we just keep making a computer's job harder by artificially increasing the computational complexity of everyday tasks.
A computer which did an acceptable job of some everyday task 10 years ago should do an equally acceptable job of that same task today. The underlying computational complexity of, say, word processing, or reading and writing email, does not change. It's a fixed task. But if you offered me a bet that 10-year-old email software would consume fewer resources and/or run faster than modern email software, I'd take it without thinking. That's just the way things are, and nobody seems to notice or even care.
Nowhere is this more true than on the web. Surfing the web should be a trivial affair. It boils down to reading bytes off the network and then putting some text and graphics on the screen. But it's not that simple anymore. Even something as simple as reading the news headlines or checking the weather now generally requires your computer to download stuff from 5 or 10 separate servers, and a lot of that stuff is executable code written in a Turing-complete programming language, so that your computer actually has to think a little before telling you the weather. Life with NoScript is miserable, because Javascript is everywhere and often you need to allow scripts from domains with no obvious relation to the website's own domain. The web is like a computing world where .txt files have been banned and everybody communicates by sending around tiny little .exe files which contain their text and also a tiny one-off graphics library that your computer uses to actually display that text.
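To make that analogy concrete, here is a minimal sketch of the same one-line weather report delivered both ways. The first page is "dumb" text the browser can paint immediately; the second ships no content at all, only a program that fetches and constructs it. (The weather.example.com URL and the shape of the JSON it returns are invented for illustration.)

```html
<!-- The "dumb" web: the content is just marked-up text.
     Any browser, on any hardware, can display it instantly. -->
<html>
  <body>
    <h1>Weather</h1>
    <p>London: 8&deg;C, light rain.</p>
  </body>
</html>

<!-- The "executable" web: the page arrives empty, and a Turing-complete
     program must run, call out to a second server, and build the markup
     before a single word appears. (The URL and JSON fields are hypothetical.) -->
<html>
  <body>
    <div id="report"></div>
    <script>
      var xhr = new XMLHttpRequest();
      xhr.open("GET", "https://weather.example.com/api/today");
      xhr.onload = function () {
        var data = JSON.parse(xhr.responseText);
        document.getElementById("report").innerHTML =
          "<h1>Weather</h1><p>" + data.city + ": " + data.temp +
          "&deg;C, " + data.conditions + ".</p>";
      };
      xhr.send();
    </script>
  </body>
</html>
```

Both pages look identical on screen, but only the second forces every visitor's CPU to execute code first, and on real sites that script is rarely twenty lines; it's usually a framework downloaded from yet another domain.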
Of course, the "executable web" is not all bad. Javascript and HTML 5 are turning the web browser into a truly platform-agnostic application development platform, and that's great. We've always needed one of those. The problem is that instead of the browser acting as an application platform when it makes sense to (e.g. when using GoogleDocs or something similar) and as a dumb interpreter of formatted text and graphics when that makes sense (e.g. when you just want to read the news), the browser is quickly becoming only an application platform and there are vanishingly few dumb webpages left. This is the reason why modern browsers are such tremendous resource hogs and why computers which are a few years old can sometimes visibly struggle for a short while to render some modern websites - despite having gigahertz clock speeds!