Certificate Patrol: Security vs complexity

Published: May 21, 2014
Tags: firefox https internet ssl web security

Today I uninstalled the Firefox add-on Certificate Patrol. Certificate Patrol implements a technique called "certificate pinning", which is designed to help detect man-in-the-middle attacks against HTTPS. The idea is simple: when you connect to a site over HTTPS, you store a copy of the fingerprint of the presented SSL certificate. The next time you connect, the fingerprint of the offered certificate is compared against the cached copy. If the certificate has changed without apparent good reason (e.g. the old certificate was expiring soon), you raise the alarm. It's a neat idea that feels like common sense in retrospect.
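The pinning scheme described above can be sketched in a few lines. This is a minimal trust-on-first-use illustration, not Certificate Patrol's actual implementation; the hostnames and the idea of a plain dictionary cache are assumptions for the sake of the example.

```python
import hashlib
import socket
import ssl

def fetch_fingerprint(host, port=443):
    """Connect to the host and return the SHA-256 fingerprint of the
    certificate it presents (DER-encoded)."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            der_cert = tls.getpeercert(binary_form=True)
    return hashlib.sha256(der_cert).hexdigest()

def check_pin(host, fingerprint, cache):
    """Trust-on-first-use check: pin the certificate on first sight,
    raise the alarm if it later changes."""
    pinned = cache.get(host)
    if pinned is None:
        cache[host] = fingerprint  # first visit: remember this fingerprint
        return "pinned"
    if pinned == fingerprint:
        return "match"
    return "mismatch"  # possible MITM -- or, as it turns out, just a CDN
```

Note that `check_pin` has no way to distinguish a malicious mismatch from a benign one; that gap is exactly the false-alarm problem discussed below.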

Why am I uninstalling it? Because security measures with high false alarm rates are useless: you end up ignoring them. Certificate pinning has a very high false alarm rate. It rests on the assumption, which feels like common sense at first glance, that SSL certificates are relatively stable things: the certificate you get from foo.com today should be the same one you get tomorrow, and next month, and probably next year. This is false for any major website. The widespread use of content delivery networks means that connections to foo.com reach different actual servers all the time, and those servers almost always present different SSL certificates. Often they are signed by different Certificate Authorities. Sometimes they are valid for different URLs (e.g. some use wildcards and some don't; some even seem to be valid for the CDN company's domain rather than the domain being visited, and I don't know how that works). Any heuristic you might come up with for separating MITM attacks from "ordinary" variation is doomed to failure, because the day-to-day degree of variation is immense.

There was a discussion on Bruce Schneier's blog recently about whether or not antivirus technology is "dead", in which a commenter said the following:

There are plenty of other ways to own a computer, some more sophisticated than others, but with DRM, anti-cheat measures, BitTorrent, NSA-mandated backdoors, and so forth, it's harder and harder to say whether a certain sequence of activities is malicious or operating as normal. *It's getting too hard to tell the difference.*

Emphasis mine. The practice of denying activity by default and permitting only known-safe activity has long and widely been recognised as the ideal approach to security, and is certainly far better than its logical inverse of allowing everything by default and trying to exhaustively enumerate all the known bad things (ideas 1 and 2 on Marcus Ranum's list of the Six Dumbest Ideas in Computer Security). Complexity is the natural enemy of this approach. The modern web is drowning in needless complexity, and the consequences of this are inescapable.
