Nmap Development mailing list archives

Updater Investigation


From: "Colin L. Rice" <ricec2 () rpi edu>
Date: Thu, 02 Jun 2011 13:45:31 -0700

David asked me to investigate more potential updating solutions with
regard to writing an auto-updater for nmap.

The specific requirements so far are:
1) It must be possible to verify the integrity of update metadata (e.g.,
  latest version number).
2) It must be possible to verify the integrity of package contents.
3) It must be possible to authenticate the package contents.
4) The system must run on Windows, Mac OS X, and Linux.
5) The key binaries (nmap itself as well as any secondary binaries such
as ndiff, ncrack, etc.) must be updated if installed.
6) The key data files (nmap-os-db, nmap-protocols, etc.) plus all of the
nselib Lua files and the scripts should be updated as well. These may
live in a separate repository from the binaries since they are
cross-platform.

The first option is to use source control, in our case SVN. Other
projects such as Metasploit and Go use the same approach. It is possible
to set up SVN over https or ssh tunneling to derive trust from the root
server, which fulfills requirement 3. Requirements 4, 5, and 6 can all
be fulfilled as well, although it may require the inclusion of binaries
in an SVN repository.

Ensuring the integrity of the update metadata as well as the package
contents is done by verifying either the https certificate or the ssh
host key and then tunneling as described above. If you are not tunneling
over https or ssh, it is quite difficult to ensure that the package
contents are correct, since a man in the middle could serve you
incorrect SVN diffs for the changed files.
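
For illustration, the client side of an SVN-based updater could be as
simple as a scheduled checkout over https; the URL and path below are
made up, but with --non-interactive svn fails instead of prompting if it
cannot verify the server certificate:

import subprocess

# Hypothetical repository URL and install location.
UPDATE_URL = "https://svn.example.org/nmap-updates/trunk"
WORKING_COPY = "/usr/local/share/nmap-updates"

def svn_update():
    # --non-interactive makes svn error out rather than ask the user to
    # accept an untrusted certificate, so trust stays rooted in the CA.
    subprocess.check_call([
        "svn", "checkout", "--non-interactive",
        UPDATE_URL, WORKING_COPY,
    ])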

However, SVN has the advantage of already being in place as our current
development environment. It would require that we either include
binaries in the Subversion repository or require anyone updating nmap to
have all of the needed development tools installed. The latter could
make it especially difficult to update nmap on Windows without Cygwin or
Visual Studio installed.

The second option is to follow the lines of OpenVAS and use wget or
rsync. There was some discussion of this previously with regard to
syncing scripts. If you use https you can authenticate the identity of
the server, and all of these programs run on Windows, Mac OS X, and
Linux. It is possible to distribute binaries as well as scripts.
However, the issue again becomes authentication of the update metadata
and the package contents: while normal network errors will be corrected,
there is no default protection against slow-replay attacks or
man-in-the-middle attacks, let alone more complicated attacks.
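
If we went this way, we would probably want to layer at least a checksum
check on top of the plain download; something along these lines (the
mirror URL is hypothetical, and a detached .sha256 file per target is an
assumption on my part):

import hashlib
import ssl
import urllib.request

BASE = "https://updates.example.org/nmap/"   # hypothetical mirror

def fetch(path):
    # Verify the server certificate against the system CA store; this
    # authenticates the server but says nothing about the content itself.
    ctx = ssl.create_default_context()
    with urllib.request.urlopen(BASE + path, context=ctx) as resp:
        return resp.read()

def fetch_verified(path):
    data = fetch(path)
    # Assume a detached "<hex digest>  <filename>" checksum next to the file.
    expected = fetch(path + ".sha256").split()[0].decode()
    if hashlib.sha256(data).hexdigest() != expected:
        raise RuntimeError("checksum mismatch for " + path)
    return data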

On the positive side, however, it is quick to set up and easy to throw
away if something better comes along.

The next option is setting up our own yum/apt repositories as well as
using the Mac OS X App Store and Windows Update.

This allows us to update both the scripts and our binaries while
deriving trust from the root server. I am unsure how hard it is to
attack these update systems, but given that they are the default update
mechanisms, nmap would be no less secure than the rest of the system.
While this does not really answer the question, apt-get and yum do have
keys for authenticating packages, Apple has authentication for the App
Store, and there appears to be something similar for Microsoft Windows.
The issue is that it would require a separate update setup for each
platform, as well as dealing with both Apple's and Microsoft's update
systems. We would have to get approval, and I have no idea how you get
Microsoft to update your application; the capability exists in Windows 7
but does not appear to be documented at all. The Mac and Windows update
channels also have the potential to be glacially slow at pushing out
updates, and they would probably not like us updating the scripts on a
daily or weekly basis.
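
For reference, what apt does under the hood is verify a detached GPG
signature on the repository's Release file against a locally trusted
keyring before it believes any of the package checksums listed inside
it; roughly (file names are illustrative):

import subprocess

def verify_release(keyring, signature, release):
    # gpgv exits non-zero if the signature does not check out against
    # the given keyring, so check_call raises and the update is aborted.
    subprocess.check_call(["gpgv", "--keyring", keyring, signature, release])

verify_release("/etc/apt/trusted.gpg", "Release.gpg", "Release")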

Another option is to do something similar to what Firefox did. Firefox
set up its auto-updater to use the Firefox network stack and run in the
background while Firefox runs. Additionally, they set up the update URLs
to include everything from build IDs to platform version and locale in
order to remove client-side processing. Their update metadata is simply
XML containing signatures for the downloaded file and the update
versions. The download servers also have https support, although it is
not rigorously enforced.

When the update metadata and the patch data have been downloaded and
verified with the signatures, Firefox starts the updater program and
shuts itself off. For minor patches all that happens is applying a
bsdiff, but for major patches they have a custom format loosely based
off of XPI (a pseudo-updater for Firefox before 1.5) for adding and
removing files as well as updating existing files.
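
To make that concrete, the update manifest boils down to XML roughly
like the sample below, and the client only has to parse it, download the
patch, and compare the hash (I am reproducing the attribute names from
memory, so treat them as approximate):

import hashlib
import xml.etree.ElementTree as ET

SAMPLE_MANIFEST = """
<updates>
  <update type="minor" version="5.60" buildID="20110602">
    <patch type="complete" URL="https://example.org/nmap-5.60-update.mar"
           hashFunction="sha512" hashValue="...hex digest elided..."
           size="1234567"/>
  </update>
</updates>
"""

def check_patch(manifest_xml, downloaded_bytes):
    # Pick the first patch entry and verify the downloaded file against
    # the hash recorded in the manifest.
    patch = ET.fromstring(manifest_xml).find("update/patch")
    digest = hashlib.new(patch.get("hashFunction"), downloaded_bytes).hexdigest()
    if digest != patch.get("hashValue"):
        raise RuntimeError("patch hash mismatch")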

Firefox clearly handles different platforms and distribution channels
quite well and is wildly successful. It ensures the integrity of the
download, although the downloads do not seem to derive trust from the
root key. It is otherwise a well-thought-out system, and it ensures that
we are able to verify the update metadata via the version numbers of
both the update and the prior version.

The final option, which has been mentioned already, is using TUF. TUF
handily solves all of the security-related issues, although it has not
faced any sort of rigorous review, so it may handle them badly or
incorrectly; I don't know enough to make that determination. In terms of
integrating it to distribute the binaries, it is certainly possible.

It is designed to plug easily into distutils, but it is also possible to
run it standalone. The server-side setup is quite easy and well
documented via their quick-start script. The client-side setup is
completely undocumented except in the source code. While the initial
investment of time will be higher than for any of the other
possibilities besides rolling our own distribution setup for apt, yum,
the Mac App Store, and Windows Update, it looks like it is actually
fairly simple. The provided utilities will update all files, provided
you tell them the location and give them a list of keys. To handle both
binary and script updates you simply modify the target list to
differentiate between binaries for different systems (see the sketch
below).
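
The client-side flow, as far as I can tell from the source, boils down
to: verify the metadata chain starting from a locally pinned root key,
then download each target and reject it unless the length and hash match
what the signed targets metadata promised. The sketch below covers only
that last step, with made-up names rather than the real TUF API, and it
shows how per-platform binaries simply become separate entries in the
target list:

import hashlib
import ssl
import urllib.request

# Stand-in for the signed "targets" metadata after TUF has verified it;
# the entries and digests here are placeholders.
TARGETS = {
    "scripts/http-title.nse":  {"length": 1480,   "sha256": "...elided..."},
    "bin/win32/nmap.exe":      {"length": 900000, "sha256": "...elided..."},
    "bin/linux-x86_64/nmap":   {"length": 850000, "sha256": "...elided..."},
}

BASE = "https://updates.example.org/targets/"   # hypothetical

def fetch_target(name, info):
    ctx = ssl.create_default_context()
    with urllib.request.urlopen(BASE + name, context=ctx) as resp:
        # Never read more than the metadata promised.
        data = resp.read(info["length"] + 1)
    if len(data) != info["length"]:
        raise RuntimeError("wrong length for " + name)
    if hashlib.sha256(data).hexdigest() != info["sha256"]:
        raise RuntimeError("wrong hash for " + name)
    return data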

Doing an update on the server should be simple and requires running one
Python script. There is actually a TUF server root already sitting in
nmap-exp/colin/updater/server_root, which took me about 3 minutes to
make. The biggest issue with TUF is the same as the issue with
everything else: we are going to need to build the binaries for all
supported platforms every time we do a release, and we are going to need
to put some files on the web server.

Tl;dr
In Conclusion:
None of these systems has been reviewed for security. They all seem to
have security solutions that may be susceptible to attack but are better
than nothing, and you can meet all of the requirements with each of
them. TUF looks to be a little more complicated on the client side but
much simpler on the server side. Using the OS update channels for
everything except Windows may be possible; Windows is a black hole.

Thanks, Colin

_______________________________________________
Sent through the nmap-dev mailing list
http://cgi.insecure.org/mailman/listinfo/nmap-dev
Archived at http://seclists.org/nmap-dev/
