Nmap Development mailing list archives

Re: [NSE] [patch] Big changes to http-enum.nse


From: Patrik Karlsson <patrik () cqure net>
Date: Sun, 17 Oct 2010 09:23:35 +0200


On 17 Oct 2010, at 07:44, Ron wrote:


Hey all,

I spent today making improvements to http-enum.nse, and I'm happy to say that they're working perfectly. I'm 
attaching a .patch, but here's an overview of the changes:

o I changed the http-fingerprints database to a new format. It no longer attempts to match the 'yokoso' project; 
instead, I made the database compatible with Nikto. Although we can't distribute Nikto checks without permission, 
this will let users use them anyway

o The http-fingerprints database is now capable of searching for text within a page (either include all pages with 
text, or don't include any with text). Additionally, it uses Lua patterns and can perform captures, then display the 
captured data back to users. Although we don't have any checks yet, this can potentially give us powerful -sV-like 
functionality against HTTP servers
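
To illustrate the idea, a fingerprint entry in the new format might look something like the sketch below. The field names here are invented for illustration; the actual format is defined in the attached patch, and the version-capture check is hypothetical:

```lua
-- Hypothetical http-fingerprints entry (field names are assumptions, not
-- taken from the patch). The Lua pattern's capture -- the version string --
-- would be substituted into the output shown to the user.
{
  checkdir  = "/wp-login.php",
  checkdesc = "WordPress login page",
  match     = "WordPress (%d+%.%d+%.?%d*)",  -- only report if this matches
  output    = "WordPress version %1",        -- %1 = the captured version
}
```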

o All pages found (with a 200 response) are stored in the registry. This will let later scripts use the pages as a 
seed for, for example, spidering (I'm hoping to start working on a primitive spider soon, too, that we can work off 
of). I haven't decided on the exact format for storing HTTP stuff in the registry yet, but it's a start. The two 
functions I put at the top of http-enum.nse can probably be moved to stdnse.lua in the future
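
Since the exact registry format wasn't finalized, here is only a rough sketch of how found pages could be seeded for a later spider; the keys and helper name are my assumptions:

```lua
-- Sketch: seed discovered pages into nmap.registry for later scripts
-- (e.g. a future http-spider.nse). Key layout is assumed, not final.
nmap.registry.http = nmap.registry.http or {}
nmap.registry.http.pages = nmap.registry.http.pages or {}

local function register_page(host, port, path)
  -- Group pages per host:port so a spider can pick up its seeds
  local key = host.ip .. ":" .. port.number
  nmap.registry.http.pages[key] = nmap.registry.http.pages[key] or {}
  table.insert(nmap.registry.http.pages[key], path)
end
```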

o On that topic, I moved get_hostname() from a local function in http.lua to a public function in stdnse.lua. I can 
see it being used for more than just internal http stuff. 

o I updated the arguments to use more modern conventions ("http-enum.<argument>") and they're all read with 
stdnse.get_script_args() now
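
In practice that convention looks like the following (the specific argument names are illustrative, not a list of what the script actually accepts):

```lua
-- Arguments are fully qualified ("http-enum.<argument>") and read through
-- stdnse.get_script_args(), e.g. via --script-args http-enum.basepath=/app/
local basepath   = stdnse.get_script_args("http-enum.basepath") or "/"
local displayall = stdnse.get_script_args("http-enum.displayall")
```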

o Added a more generic version of pGet() and pHead() to HTTP that can make a request with any verb. I updated pGet() 
and pHead() to use that function. 

o Added response_contains() to http.lua. It searches the response's status line, headers, and body for the given 
string and returns success (and any captures) if it's found. 
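
A minimal pure-Lua sketch of what such a helper might look like; the field names follow http.lua's response table ('status-line', 'header', 'body'), but the function body is a guess at the behavior described above, not the patch itself:

```lua
-- Sketch of response_contains(): search the status line, then each header,
-- then the body for a Lua pattern; on a hit, return true plus any captures.
local function response_contains(response, pattern)
  -- Status line first
  local m = { string.match(response['status-line'] or "", pattern) }
  if #m > 0 then return true, m end

  -- Then every header value
  for _, header in pairs(response.header or {}) do
    m = { string.match(header, pattern) }
    if #m > 0 then return true, m end
  end

  -- Finally the body
  m = { string.match(response.body or "", pattern) }
  if #m > 0 then return true, m end

  return false
end
```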

o Updated documentation all over the place. 

For what it's worth, after combining my various fingerprint files, we have 999 web fingerprints. So close to an even 
thousand! 

Against my server, and across the Internet, it takes only a few seconds (5-10) to do the whole thing. 
scanme.insecure.org doesn't seem to like pipelining, so it takes about a minute. Not terribly bad, in my opinion. 

Anyway, let me know if you think it's good to commit. I'm hoping to start some preliminary work on http-spider.nse 
tomorrow, see how far I make it before I get stuck on something. :)

Ron
<http-enum-changes.patch>
_______________________________________________
Sent through the nmap-dev mailing list
http://cgi.insecure.org/mailman/listinfo/nmap-dev
Archived at http://seclists.org/nmap-dev/


Nice! I actually started looking into this yesterday as well, but ran into a problem.
While Nikto takes a more vulnerability/discovery-oriented approach to detecting applications and URLs, I wanted to 
take a more fingerprint-oriented approach.
I'm not sure if my fingerprint-oriented approach is in line with the current goal of http-enum?

Basically I want to be able to do:
/admin - Bubba|2 NAS administration web page
/webmail - Squirrelmail v1.2.3
/webmail - Outlook Web Access v2.3.4
/webmail - GroupWise Web Access v1.2.3
/wp - WordPress v1.2.3
/wordpress - Wordpress v1.2.3

Rather than:
/admin - Admin Directory
/web - Potentially interesting folder

Anyway, in order to be able to do fingerprinting, I came to the conclusion that I wanted to split the probe and match 
parts so that the design resembles the service/version scan.
Splitting the two makes it possible to run a single probe for e.g. /admin and then have several match lines determine 
which application is actually behind that URL.
While the current design does allow this, it involves making a new request for each match, which will eventually 
become a problem as the database grows.
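
The split Patrik describes could look roughly like this, loosely modeled on the nmap-service-probes probe/match layout; every field name below is an assumption for illustration:

```lua
-- Hypothetical split probe/match fingerprint: one HTTP request per probe,
-- then every match line is evaluated against the cached response, so no
-- extra requests are needed as the match list grows.
{
  probe = { path = "/webmail", method = "GET" },
  matches = {
    { match = "SquirrelMail version ([%d%.]+)", name = "Squirrelmail v%1" },
    { match = "Outlook Web Access",             name = "Outlook Web Access" },
    { match = "GroupWise WebAccess ([%d%.]+)",  name = "GroupWise Web Access v%1" },
  },
}
```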

//Patrik

--
Patrik Karlsson
http://www.cqure.net
http://www.twitter.com/nevdull77






