Gunnar’s Foresight on Google’s Security Hole

Gunnar Peterson is as prolific as he is knowledgeable. Thanks to his steady stream of great content, a post of his from a few weeks ago already feels old, but in it he made some prescient comments about the problems that come from custom security software. The post is humorous as well as insightful, and it explains why not enough IT budget is spent on software security.


He makes a strong point: other security areas, such as network security, get the bulk of the IT budget, while not enough is spent on ensuring software security. It’s just not taken seriously. Hence the name of the post, “Golf Driven Security”: in his estimation, IT decision makers play golf with their network security sales reps and sign up for renewals, while deciding that internal folks can roll their own software security.

Rolling your own security code is dangerous business.

At the end of the post, he says this:

People don’t write their own virus protection, but for some reason attempt to do their own input validation, it is the same exact problem. People routinely write their own authentication, authorization and audit. I could go on.

Gunnar’s post was written on Aug 24. A few weeks later, ZDNet published an article describing a security vulnerability in Google’s implementation of the SAML 2.0 Web Browser SSO profile. The Google coders had rolled their own software security.

Personally, I found the original paper about the flaw rather heavy on formalism, but I can see that it makes an important point: if we can formally define such protocols and their possible attacks, we can automate analyses to find flaws in the protocols.

On the other hand, the flaws actually found so far seem very basic. For example:

Authentication should not be transitive. If I authenticate to service A, service A should not be able to authenticate to service B as me. Of course, a system that accepts usernames and passwords can often do exactly that, which is one reason username/password authentication sucks at service provider sites. If service A is evil, it should not be able to impersonate me. The Google implementation essentially assumed that all Google services and their partners were good. That seems to be a little beyond even the Google code of conduct.
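To make the non-transitivity point concrete, here is a minimal sketch in Python of the check a relying service should perform before accepting an assertion. The field names and service identifiers here are hypothetical, and a real SAML implementation would also verify signatures, timestamps, and more; this only illustrates the audience check.

# Hypothetical sketch: a service accepts an assertion only if it was
# issued for that service. Field names are illustrative, not real SAML.

def accept_assertion(assertion: dict, my_service_id: str) -> bool:
    """Accept an assertion only if this service is its intended audience."""
    # An assertion minted for service A names A as its audience. If
    # service B skips this check, a malicious A can replay the assertion
    # at B and impersonate the user: transitivity in action.
    return assertion.get("audience") == my_service_id

# A token issued to service A...
assertion = {"subject": "alice", "audience": "https://service-a.example"}

# ...is fine at A, but must be rejected when presented to service B.
assert accept_assertion(assertion, "https://service-a.example")
assert not accept_assertion(assertion, "https://service-b.example")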

Kim had a strong reaction to the emphasis of the reporting, pointing out the importance of partitioning (which Google’s implementation disabled) as a defense against insider attacks. Many identity bloggers commented and offered helpful suggestions to the Google coders.

It wasn’t a flaw in the SAML 2.0 protocol; it was a flaw in Google’s implementation. They rolled their own security code and, in this case, omitted some important aspects of the protocol.
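To illustrate one of those aspects (as a sketch of what the protocol calls for, not a description of Google’s actual code): SAML binds each response to the authentication request that triggered it via the response’s InResponseTo attribute. A service provider that remembers its outstanding request IDs can reject unsolicited or replayed responses.

import secrets

# Hypothetical sketch of request/response binding. The names mirror
# SAML's AuthnRequest ID and the response's InResponseTo attribute;
# everything else (signatures, expiry, audience) is omitted here.

outstanding_requests: set = set()

def new_authn_request() -> dict:
    """Issue an authentication request and remember its ID."""
    request_id = secrets.token_hex(16)
    outstanding_requests.add(request_id)
    return {"ID": request_id}

def accept_response(response: dict) -> bool:
    """Accept a response only if it answers a request we actually made."""
    request_id = response.get("InResponseTo")
    if request_id not in outstanding_requests:
        return False  # unsolicited or replayed response: reject it
    outstanding_requests.discard(request_id)  # one-time use
    return True

request = new_authn_request()
assert accept_response({"InResponseTo": request["ID"]})      # genuine
assert not accept_response({"InResponseTo": request["ID"]})  # replay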

I found all this interesting because I have been working on a project for the past few years that specifically aims to build reusable open source components that provide solid authentication, authorization, and auditing capabilities to network services. I agree with Gunnar that it’s surprising how little people seem to care about software security. Our approach is this: if we make it easy enough to enable applications with such capabilities, if it’s easier for developers to use Bandit code than to roll their own, then we can avoid many security holes. And, for my employer, sell more identity management products.

Perhaps our project motto should be:

Don’t roll your own Authentication, Authorization, and Audit code. Steal it from others. Be a Bandit.

Gunnar’s “Golf Driven Security” post concludes with this statement:

I have rarely seen an industry so ripe for disruptive innovation as software security.

Indeed.