Secure Coding mailing list archives
Source code hiding doesn't work (was: Re: State Department break-in last summer)
From: dwheeler at ida.org (David A. Wheeler)
Date: Mon, 23 Apr 2007 12:08:42 -0400
Florian Weimer said:
The times when you couldn't get source code for proprietary, off-the-shelf software are over. Welcome to the new world order! 8-)
Amen, and please let me add a few additional comments, because I'm afraid a lot of people "outside" don't really understand why this "hiding of source code" is a waste of time from a security vulnerability point-of-view. Here are my thoughts on the topic, which I hope some other people will agree with.

This notion - that merely "hiding the source code" really prevents vulnerabilities from being discovered - is (I believe) nonsense:

* Attackers typically don't need to examine source OR binary. Just sending data to the program, and seeing what it sends back, is often revealing enough to find a vulnerability. This is why "fuzz" testing, and sending malicious data to websites, work so well. Add in tools to observe in detail what happens with various inputs & outputs, and/or user documentation, and the attacker simply has tons of data to work with... they don't NEED more.

* If attackers want to examine something using static analysis, examining the binary is often enough. There are tools that can examine binaries for vulnerable patterns.

* If attackers want to examine source, they can reconstruct it "enough" from binaries (using decompilers) to use source-based approaches. Yes, the resulting source code would be hideous to maintain, but an attacker doesn't NEED to maintain it. An attacker just needs to attack the program, which is much easier (only one vulnerability needs to be found, etc.).

* If they don't want to bother decompiling, they can often buy the source code, get a license for it, or steal it from those who have it. Lots of vendors make source available.

Note that you can typically buy the binary (or access the website), and you can often buy and decompile the binary, so there's typically no barrier to the first three methods. Now, there ARE reasons in some cases to hide source code.
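To make the first point concrete, here is a minimal sketch of the black-box "fuzz testing" idea: throw random inputs at a program and watch what comes back. Everything here is hypothetical and invented for illustration (the buggy `naive_parser` stands in for the target program); the point is only that an attacker needs neither source nor binary internals, just inputs and observable failures.

```python
import random

def naive_parser(data: bytes) -> int:
    # Hypothetical fuzz target: a "length-prefixed" parser that trusts
    # the first byte as a declared length. It stands in for any program
    # the attacker can feed data to but cannot inspect.
    if not data:
        raise ValueError("empty input")
    declared = data[0]
    body = data[1:1 + declared]
    if len(body) != declared:  # declared length exceeds the actual payload
        raise IndexError("declared length exceeds payload")
    return sum(body)

def fuzz(target, trials=2000, max_len=16, seed=42):
    """Black-box fuzzing: send random byte strings to `target` and
    record every input that makes it blow up. No source, no binary,
    no symbols - just inputs in, behavior out."""
    rng = random.Random(seed)
    failures = []
    for _ in range(trials):
        data = bytes(rng.randrange(256) for _ in range(rng.randrange(max_len)))
        try:
            target(data)
        except Exception as exc:
            failures.append((data, type(exc).__name__))
    return failures

failures = fuzz(naive_parser)
```

Even this crude loop, with no knowledge of the parser's internals, quickly piles up inputs that trigger the length-check bug - which is exactly why hiding the source code does nothing to hide the vulnerability.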
I believe hiding the source code is all about restricting the ability to MAINTAIN a program to a privileged organization, in order to (1) collect payment for use as a proprietary program and (2) make it harder for others to develop competing products (because they'll have to redo a lot of that work). For _financial_ purposes, such hiding is sensible if you're selling a closed-source (proprietary) program. As long as you're clear about THAT being the rationale for hiding source code, you're at least being level-headed.

But don't confuse that economic rationale with providing any _security_ benefits against vulnerabilities in the program. I believe the latter is nonsense. Hiding the source code to prevent the revelation of vulnerabilities is just another form of security by obscurity, and I believe that security by obscurity is a horrifically bad basis for security. Not because it can't work - in theory it can - but because it's notoriously difficult to TRULY obscure something, and you don't normally know when you've been compromised (so when you've been had, you still think you're safe). It's so hard to PRACTICALLY implement all the secrecy required that security-by-obscurity is often impractical. That's REALLY obvious for software.

If you were REALLY serious about security through obscurity (hiding information about a system in order to protect it from serious adversaries), you'd need to do the following:

* Ensure that only a VERY few people can get the source code. Locked rooms, no Internet connectivity for machines with the source code, a few people with extensive background checks, no USB drives or writable CDs.

* Ensure that only a VERY few people can get the executable files. Think ROMs with black goo on them, etc. Obviously, you can't sell binaries to more than a few people, and only after extensive background checks of those running the program (!).

* Ensure that only a VERY few people can send data to, and receive data back from, the program.
Think running in a closed facility, with personnel you trust controlling the inputs and outputs, preferably with monitoring systems.

Can you enforce all these requirements using a traditional COTS product, either closed or open source? Don't make me laugh. You can't set up a website or sell an executable file - never mind distribute the source code - while still retaining the necessary amount of obscurity for "security by obscurity" to have a PRAYER of succeeding.

I think this is why so many people still try to do "security by obscurity", even though it keeps failing so spectacularly in typical software projects. In some other fields you can hide the necessary information pretty easily, so people with those mindsets try to apply the approach to software. But to work in software, the amount of obscurity required is typically impractical; they're misled into thinking that merely hiding the source code is all there is to it, and it's not.

Can COTS programs, be they proprietary or open source software, be made secure? Yes. But "security through obscurity" approaches are completely doomed to failure for COTS, because it's economically insane to try to develop COTS products while still retaining the amount of obscurity necessary for "security by obscurity" to work. Instead, you need to concentrate on techniques that actually produce secure software (design to reduce trust, multi-person review, etc.).

In theory this COULD work for in-house software (military software, that sort of thing). But you have to REALLY hide it, which is really hard to accomplish. And one sale of the device "outside" the organization, or one insider who releases the information, could suddenly cause horrific vulnerabilities, without anyone realizing it. Better to avoid having the vulnerabilities in the first place. The trick is to get others to understand that.

--- David A. Wheeler
Current thread:
- Source code hiding doesn't work (was: Re: State Department break-in last summer) David A. Wheeler (Apr 23)