Afterbites is a blog segment in which Marcus Ranum provides more in-depth coverage and analysis of the SANS NewsBites newsletter. This week Marcus will be commenting on the following article:
Gartner Report Says Two-Factor Authentication Isn't Enough
(December 14, 2009)
A report from Gartner says that two-factor authentication is not providing adequate security against fraud and online attacks. Specifically, Trojan-based, man-in-the-middle browser attacks manage to bypass strong two-factor authentication. The problem resides in authentication methods that rely on browser communications. The report predicts that while bank accounts have been the primary target of such attacks, they are likely to spread "to other sectors and applications that contain sensitive valuable information and data." Gartner analyst Avivah Litan recommends "server-based fraud detection and out-of-band transaction verification" to help mitigate the problem.
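To make "out-of-band transaction verification" concrete, here is a minimal sketch of the idea: the server derives a short one-time code from the actual transaction details and delivers it over a separate channel (a phone call or SMS), so a browser-resident Trojan that silently rewrites the payee or amount invalidates the code the user confirms. The function names, the six-digit code length, and the HMAC construction are my own illustrative assumptions, not anything specified in the Gartner report.

```python
import hashlib
import hmac
import secrets

def make_challenge(tx_details: str, key: bytes) -> tuple[str, str]:
    """Generate a one-time code bound to the transaction details.

    The code would be read to the user over a separate channel, so it
    only confirms the transaction the server actually sees.
    """
    nonce = secrets.token_hex(8)  # per-transaction freshness
    mac = hmac.new(key, (nonce + "|" + tx_details).encode(), hashlib.sha256)
    return nonce, mac.hexdigest()[:6]

def verify_challenge(tx_details: str, nonce: str, code: str, key: bytes) -> bool:
    """Recompute the code from what the server is about to execute."""
    expected = hmac.new(key, (nonce + "|" + tx_details).encode(),
                        hashlib.sha256).hexdigest()[:6]
    return hmac.compare_digest(expected, code)
```

The point of binding the code to the details (rather than just issuing a random PIN) is that a man-in-the-middle who alters the transaction in flight ends up with a code the user never confirmed.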
I found this article interesting because it typifies, for me, the end result of the "whack-a-mole" approach to computer security. Certain technologies are sold as "security enablers," but customers don't seem to understand (and/or aren't informed of) the reality: security is a top-to-bottom problem with no single place where you can add a widget that'll magically make you safe.
Transaction layer security was seen as a problem around 1996-7, and the result was SSL. Perhaps some of you remember the stillborn s-http, which competed briefly with https; the "debate" there was over how much certificate checking and mutual authentication needed to be in the transaction layer. The simpler approach won, and we all got SSL. Unfortunately, the cursory degree to which certificates are validated and checked leaves SSL fairly straightforward to "man in the middle" attack. If I were a paranoid, I'd think it was deliberate. But, we're left with an "end-to-end" security system that, well, simply isn't an end-to-end transaction security system. I'm surprised that anyone (even Gartner) is surprised that this is a problem. I'm amazed that the security community and its customers have tolerated it. (It's only tolerable, I suppose, because PKI is intolerable thanks to the zealous efforts of the standards bodies.)
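To show how little separates a checked connection from an unchecked one, here's a sketch using Python's `ssl` module (my choice of illustration, not anything from the original s-http/https debate). The two-line difference at the bottom is exactly the laxness that makes a man-in-the-middle attack straightforward: with verification off, any attacker can present a self-signed certificate and the handshake succeeds anyway.

```python
import ssl

# A context that actually checks the server's certificate chain and
# hostname -- the closest thing to end-to-end the protocol offers.
strict = ssl.create_default_context()
assert strict.verify_mode == ssl.CERT_REQUIRED
assert strict.check_hostname is True

# What far too many clients effectively do: verification switched off.
lax = ssl.create_default_context()
lax.check_hostname = False        # stop matching the cert to the host
lax.verify_mode = ssl.CERT_NONE   # stop validating the chain entirely
```

Note that nothing about the encryption itself changes between the two contexts; the traffic is just as scrambled either way. The only thing lost is any assurance of *who* is on the other end, which is the whole point of a transaction security system.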
And, of course, the end-point must be secure, or your transaction security can be bypassed at the end-point. Who cares if you're using two-factor authentication and an encrypted end-to-end link, if your operating system allows someone to take control of the keyboard, or to load a device driver that can read the memory area in your browser where your encryption keys are held in the clear?
In other words, the title SANS gave this "NewsBite" is almost 100% wrong. What Gartner identified (regardless of how they phrased it) is not that two-factor authentication isn't "strong enough" - it's rather that everything surrounding the two-factor authentication is too weak. The problem of layering security in networks and operating systems was well understood by the late 1960s - it's not rocket science and it never has been. What's been happening is that the security industry has been chasing the mythical magical leverage-point where a quick and dirty fix will magically get the whole job done. Gartner is part of that problem. If we're ever going to see progress made, it'll come as part of a consistent attempt to "level up" security across the board:
- in the software (code quality, reliability, security)
- in the operating system (layering, tamper resistance, trustworthy distribution)
- in the browser and transaction layers (non-bypassable transaction layer security, trusted keyboard path, improved key storage)
- in authentication (offline cryptographic authentication processors - smart cards and keyfobs that do not trust the operating system)
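That last bullet - a cryptographic processor that does its work offline and never trusts the operating system - is essentially what the HOTP keyfob algorithm (RFC 4226, published in 2005) standardized. Here's a minimal sketch of what such a fob computes internally; the shared secret shown in the usage comment is the RFC's own published test key, and the rest is my compact rendering of the spec, not production code.

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226).

    The secret never leaves the fob; the host only ever sees the
    six-digit code, so a compromised operating system cannot steal
    a reusable credential.
    """
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                               # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# With the RFC 4226 test secret b"12345678901234567890", counter 0
# yields "755224" and counter 1 yields "287082".
```

The design choice worth noticing is that the trust boundary sits at the fob's casing rather than anywhere in the software stack - exactly the "level up" in authentication argued for above.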
In 1997 I suggested that we scrap it all and start over from scratch. I still think it's a good idea (especially when I look at Web 2.0 "frameworks") - but every year that goes by makes the code-load larger and the eventual cost of replacement higher.