Nmap Development mailing list archives

RE: Updater Investigation


From: "Rob Nicholls" <robert () robnicholls co uk>
Date: Fri, 3 Jun 2011 07:34:35 +0100

Hi Colin,

I'm pretty sure that Windows Update/Microsoft Update can only be used to distribute Microsoft updates or WHQL-signed 
driver updates from third parties; they currently do not support third-party applications (and probably never will!). 
This is why Oracle, Adobe etc. all have their own update mechanisms on Windows. This is a bit of a shame, as MU/WU ties 
into WSUS, which is a pretty good free system for administering patches (e.g. automatically approving certain updates).

Although you haven't stated it, I'm hoping that when you talk about updating binaries on Windows you're also taking 
into account updating WinPcap and the Visual C++ Redistributables where required. Minor updates to Visual C++ would 
most likely be handled by Windows/Microsoft Update anyway (so could potentially be de-scoped?), but when we changed 
from building Nmap with VC++ 2008 to 2010 (for example) we had to stop supporting it on Windows 2000 - equally, when we 
moved from 2005 to 2008 I think we hit issues where certain Microsoft patches had to already be installed for the 2008 
vcredist to install properly on 2000. I can't see it happening yet, but perhaps in the next few years we'll move to a 
newer version of VC++ when XP is no longer supported by Microsoft and we'll need to be careful not to break existing 
setups.

Will there be a rollback mechanism if the update goes wrong? At the moment the Windows installer simply warns about 
failed installs of vcredist, for example, and leaves it to the user to fix any issues before they launch anything.
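
A minimal rollback approach (a hypothetical sketch only; the function and paths are illustrative, not anything the installer currently does) could snapshot the installed files before applying an update and restore them on failure:

```python
import shutil
from pathlib import Path

def apply_update_with_rollback(install_dir: str, staged_dir: str) -> bool:
    """Back up install_dir, copy in the staged files, restore on failure."""
    install = Path(install_dir)
    backup = install.with_suffix(".bak")

    # Snapshot the current installation so we can roll back.
    if backup.exists():
        shutil.rmtree(backup)
    shutil.copytree(install, backup)

    try:
        # Overwrite the installation with the staged update.
        for src in Path(staged_dir).rglob("*"):
            if src.is_file():
                dest = install / src.relative_to(staged_dir)
                dest.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(src, dest)
        return True
    except OSError:
        # Something went wrong: restore the snapshot.
        shutil.rmtree(install)
        shutil.move(str(backup), str(install))
        return False
```

A real updater would also need to cope with files locked by running processes, which on Windows usually means renaming them aside rather than overwriting in place.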

If it's of any use, I do have a batch file version of the existing Makefile on Windows (to save me installing Cygwin), 
but it uses 7-Zip to do the zip compression (although that's only there to create the Nmap zip file) and I currently 
have to update the version number manually. I can send a copy to the list if it's of any use, but we'd still need 
people to install Visual C++ 2010 Express Edition if building from source, and this could - and probably should, IMHO - 
rule it out as a viable option (and if Nmap upgrades compilers, we need a way to notify users that they need to 
upgrade their copy of VC++ too).

Rob

-----Original Message-----
From: nmap-dev-bounces () insecure org [mailto:nmap-dev-bounces () insecure org] On Behalf Of Colin L. Rice
Sent: 02 June 2011 21:46
To: nmap-dev () insecure org
Subject: Updater Investigation

David asked me to investigate more potential updating solutions with regard to writing an auto-updater for Nmap.

The specific requirements so far are:
1) It must be possible to verify the integrity of update metadata (e.g.,
  latest version number).
2) It must be possible to verify the integrity of package contents.
3) It must be possible to authenticate the package contents.
4) The system must run on Windows, Mac OS X, and Linux.
5) The key binary, nmap, as well as any secondary binaries (ndiff, ncrack, etc.), must be updated if installed.
6) The key data files (nmap-os-db, nmap-protocols, etc.), plus all of the nselib Lua files and the scripts, should be 
updated as well. These may live in a separate repository from the binaries since they are cross-platform.
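
Requirements 1-3 boil down to checking downloaded contents against trusted metadata. A conceptual sketch (the JSON metadata format here is invented for illustration; in a real system the metadata document itself would carry a signature from the release key, verified first, which is requirement #3):

```python
import hashlib
import json

def check_update(metadata_json: str, installed_version: tuple,
                 package_bytes: bytes) -> bool:
    """Validate update metadata and package contents.

    Assumes metadata_json has already been authenticated (e.g. via a
    GPG/RSA signature from the release key); this only covers
    requirements #1 and #2.
    """
    meta = json.loads(metadata_json)
    # Requirement #1: the metadata must announce a newer version than
    # what is installed, which also blocks simple rollback attacks.
    if tuple(meta["version"]) <= tuple(installed_version):
        return False
    # Requirement #2: package contents must match the published digest.
    return hashlib.sha256(package_bytes).hexdigest() == meta["sha256"]
```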

The first option is to use source control, in our case SVN. Other projects, such as Metasploit and Go, use the same 
approach. It is possible to set up SVN over https or ssh tunneling to derive trust from the root server, which fulfills 
requirement #3. Requirements 4, 5, and 6 can all be fulfilled as well, although it may require including binaries in an 
svn repository.

Ensuring the integrity of the update metadata as well as the package contents is done by verifying either the https 
certificate or the ssh host key and then relying on the aforementioned tunneling. If you are not using such a tunnel, 
it is quite difficult to ensure that the package contents are correct, since a man in the middle could serve you 
incorrect svn diffs for the changed files.

However, SVN does have the advantage of already being in place as our current development environment. It would 
require that we either include binaries in the Subversion repository or that anyone updating nmap have all of the 
needed dev tools installed. The latter could make it especially difficult to update nmap on Windows without Cygwin or 
Visual Studio.

The second option is to follow the lines of OpenVAS and use wget or rsync. There was some discussion of this previously 
with regards to syncing scripts. If you use https you can authenticate the identity of the server, and all of these 
programs run on Windows, Mac, and Linux. It is possible to distribute binaries as well as scripts. However, the issue 
again becomes authentication of the update metadata and the package contents: while normal network errors will be 
corrected, there is no default protection against replay or man-in-the-middle attacks, let alone more complicated 
attacks.

On the positive side, however, it is rather quick to set up and easily thrown away if something better comes along.
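
A sketch of the download side of this option, with the digest check layered on top (the filenames and the convention of publishing a `.sha256` file next to each package are my own illustration): fetch over https, which authenticates the server, then verify the file against its published digest. Note this catches corruption but still does not authenticate the metadata itself.

```python
import hashlib

def fetch_and_check(fetch, name: str) -> bytes:
    """Download a file plus its published SHA-256 digest; refuse on mismatch.

    fetch is any callable mapping a path to bytes; over https it could be
    lambda p: urllib.request.urlopen(base_url + "/" + p).read().
    """
    data = fetch(name)
    # Digest files conventionally hold "<hexdigest>  <filename>".
    expected = fetch(name + ".sha256").decode().split()[0]
    if hashlib.sha256(data).hexdigest() != expected:
        raise ValueError("digest mismatch for " + name)
    return data
```

Passing the fetch function in also makes the verification logic testable without a network.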

The next option is setting up our own yum/apt repositories, as well as using the Mac App Store and Windows Update.

This allows us to update both the scripts and our binaries, as well as deriving trust from the root server. I am 
unsure how hard it is to attack these update systems, but given that they are the platforms' default update systems, 
they should be no less secure than the rest of the system. While this does not really answer the question, it is true 
that apt-get and yum have keys for authenticating packages, and Apple has authentication as well for the App Store. 
Something similar appears to exist for Microsoft Windows. The issue is that it would require a separate update setup 
for each platform, as well as dealing with both Apple's and Microsoft's update systems. We would have to get approval, 
and I have no idea how you get Microsoft to update your application. They have the capability in Windows 7 but it does 
not appear to be documented at all.
All of the Mac and Windows update channels also have the potential to be glacially slow at pushing out updates, and 
they would probably not like us updating the scripts on a daily or weekly basis.

Another option is to do something similar to what Firefox did. Firefox set up their auto-updater to use the Firefox 
network stack and run in the background while Firefox runs. Additionally, they set up the update URLs to include 
everything from build IDs to platform version and locale in order to remove client-side processing. Their update 
metadata is simply packaged XML with signatures for the downloaded file and the update versions. The download servers 
also have https support, although it is not rigorously enforced.
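
An update check in that style might look like the following (the URL template and XML fields are modeled loosely on Firefox's scheme, not copied from it, so treat every name here as illustrative):

```python
import xml.etree.ElementTree as ET

def build_update_url(base: str, product: str, version: str,
                     build_id: str, platform: str, locale: str) -> str:
    """Encode everything the server needs to pick the right update into
    the URL itself, so the client does no decision-making."""
    return (base + "/update/"
            + "/".join([product, version, build_id, platform, locale])
            + "/update.xml")

def parse_update_metadata(xml_text: str) -> dict:
    """Extract the new version, download URL, and file hash from an
    (illustrative) update manifest."""
    root = ET.fromstring(xml_text)
    update = root.find("update")
    patch = update.find("patch")
    return {
        "version": update.get("version"),
        "url": patch.get("URL"),
        "hash": patch.get("hashValue"),
    }
```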

When the update metadata and the patch data have been downloaded and verified with signatures, Firefox starts the 
updater program and shuts itself off. For minor patches all that happens is a bsdiff, but for major patches they have a 
custom format, loosely based off of XPI (a pseudo-updater for Firefox before 1.5), for adding and removing files as 
well as updating existing files.

Firefox clearly handles different platforms and distribution channels quite well and is wildly successful. It ensures 
the integrity of the download. The downloads do not seem to derive trust from the root key, but it is otherwise a 
well-thought-out system. It also ensures that we are able to verify the update metadata via version number, both of 
the update and of the prior version.

The final option, which has been mentioned already, is using TUF. TUF handily solves all of the security-related 
issues, although it has not faced any sort of rigorous review, so it may handle them badly or incorrectly; I don't know 
enough to make that determination. In terms of integrating it to distribute the binaries, it is certainly possible.

It is designed to plug easily into distutils, but it is also possible to run it standalone. The server-side setup is 
quite easy and well documented via their quick-start script; the client-side setup is completely undocumented except 
in the source code. While the initial investment of time will be higher than for any of the other possibilities 
(besides rolling our own distribution server for apt, yum, the Mac App Store, and Windows Update), it looks like it is 
actually fairly simple. The provided utilities will update all files, provided you tell them the location and give 
them a list of keys. To do both binary and script updates, you simply have to modify the target list to differentiate 
between binaries for different systems.
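
For example, splitting cross-platform targets from per-platform binaries might look like this (the paths and structure are my own illustration, not TUF's actual target format):

```python
import platform

# Cross-platform data files and scripts: every client fetches these.
COMMON_TARGETS = [
    "nmap-os-db",
    "nmap-protocols",
    "nselib/",
    "scripts/",
]

# Per-platform binaries, keyed by what platform.system() reports.
PLATFORM_TARGETS = {
    "Windows": ["windows/nmap.exe", "windows/ndiff.exe"],
    "Darwin":  ["macosx/nmap", "macosx/ndiff"],
    "Linux":   ["linux/nmap", "linux/ndiff"],
}

def targets_for_this_host() -> list:
    """Return the target list the updater should fetch on this machine."""
    return COMMON_TARGETS + PLATFORM_TARGETS.get(platform.system(), [])
```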

Doing an update on the server should be simple and requires running one Python script. There is actually a TUF server 
root already sitting in nmap-exp/colin/updater/server_root, which took me about 3 minutes to make. The biggest issue 
with TUF is the same as the issue for everything else: we are going to need to build the binaries for all supported 
platforms on the server every time we do a release. Additionally, we are going to need to put some files on the web 
server.

Tl;dr
In conclusion:
None of these systems has been reviewed for security. They all seem to have security solutions that may be susceptible 
to attack but are better than nothing. All of the requirements can be met with each of them.
TUF looks to be a little more complicated on the client side but much simpler on the server side. Using the OS update 
channels for everything except Windows may be possible. Windows is a black hole.

Thanks, Colin

_______________________________________________
Sent through the nmap-dev mailing list
http://cgi.insecure.org/mailman/listinfo/nmap-dev
Archived at http://seclists.org/nmap-dev/


