Nmap Development mailing list archives

Re: Web App Scanner - GSoC 2009


From: João <3rd.box () gmail com>
Date: Sat, 28 Mar 2009 16:04:17 -0300

On Sat, Mar 28, 2009 at 11:31 AM, Rob Nicholls
<robert () everythingeverything co uk> wrote:
This sounds like it would make a couple of good NSE scripts

Thanks for the suggestion Patrick,

I've attached a script of mine that only checks a few files and folders at
the moment and would need to be expanded over time (especially for more
advanced checks when the server returns 200 for non-existent files -
although that might just be a "nice to have" - and identifying folders
that allow directory listings), but I thought I'd share it and perhaps
others will have some suggestions/improvements in mind.
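(Editor's note on the "server returns 200 for non-existent files" case mentioned above: one common trick is to probe a random, almost certainly non-existent path first, and treat 200 responses as meaningless when the probe also gets a 200. A rough Python sketch of the idea; the `fetch` callback and the candidate list are hypothetical stand-ins for whatever HTTP layer the script actually uses:)

```python
import random
import string

def looks_like_wildcard(fetch):
    """Probe a random, almost certainly non-existent path; if the server
    still answers 200, status codes alone can't prove a file exists."""
    probe = "/" + "".join(random.choice(string.ascii_lowercase) for _ in range(16))
    return fetch(probe) == 200

def enum_paths(fetch, candidates):
    """Return the candidate paths that appear to exist, skipping the
    check entirely on catch-all servers to avoid false positives."""
    if looks_like_wildcard(fetch):
        return []  # every path "exists" here, so a 200 means nothing
    return [p for p in candidates if fetch(p) == 200]
```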

Here's some sample output:

Interesting ports on aurora.apache.org (192.87.106.226):
PORT   STATE SERVICE
80/tcp open  http
|  http-enum: /dev/ Possible development directory
|  /icons/ Icons directory
|  /images/ Images directory
|_ /mail/ Mail directory

Interesting ports on scanme.nmap.org (64.13.134.52):
PORT   STATE SERVICE
80/tcp open  http
|_ http-enum: /icons/ Icons directory

Interesting ports on wwwtk2test2.microsoft.com (207.46.193.254):
PORT   STATE SERVICE
80/tcp open  http
|  http-enum: /beta/ Beta directory (Access Forbidden)
|  /data/ Data directory (Access Forbidden)
|_ /test/ Test directory (Access Forbidden)

Interesting ports on xxx.xxx.xxx.xxx (xxx.xxx.xxx.xxx):
PORT    STATE SERVICE
443/tcp open  https
|_ http-enum: /webmail/ Webmail directory

I've used quite a few web application tools over the last few years and
I'm not sure that a command line tool like Nmap would be the best place to
add some of the suggested functionality.

Tools like Nikto certainly have their use, and perhaps we can produce a
similar NSE script so Nmap can match most of the functionality (and
possibly licence CIRT's database?); but for "serious" web application
testing I'd personally want a dedicated tool that lets me do things like
script logins, complete forms interactively in a browser, can follow
JavaScript links, test sites that use multiple servers/subdomains, lets me
manually crawl a website in a browser, doesn't produce too many false
positives, and (most importantly) lets me see the request and response
when I'm writing a report.

You are certainly thinking of a much bigger application than the one
I described. Anyway, I think this is very good feedback, since there
is no use for a "not-serious" web app testing tool. Thanks for all
the suggestions.

I'm quite new to NSE (I only started coding with it yesterday) and
I'm not yet used to all its functions. I think that to develop some
of this functionality I'll need to write some modules, which would
enlarge the NSE API (I see that as a good thing).

Maybe having some visual information would be nice (like graphs to
show crawling results or the web app mapping). If so, NSE could
become a limitation for the tool.

The big problem here is that, for a GSoC project, I need to write
everything in three months. For that reason I must plan a tool that
can be developed in time. Of course I intend to improve it in the
future, since it is a tool that I'm going to use (and a lot of other
people too, I hope!). What we can do is design the project so that it
can be easily extended after GSoC.


Being able to compare the results against some form of vulnerability
database sounds good, but medium-large sites often return several
gigabytes' worth of traffic, and this might not be healthy for Nmap
(or the host) to hold in memory and could be tricky to store in XML
files etc. (other tools I've used typically store data in some form
of SQL database).

Yes, I'm sure that holding gigabytes in memory is not healthy. But I
don't see it being a problem only for the comparison with the vuln
databases. Maybe the best idea is keeping the results in a file (XML
sounds good enough). There is also the possibility of creating a
simple GUI that could handle this file and create reports, maybe in a
fancy HTML format, or something. Using a database creates a
dependency that is sometimes not available.
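(Editor's note: keeping results in an XML file could be as simple as the sketch below, using Python's standard library; the element and attribute names are made up for illustration, not any real report format:)

```python
import xml.etree.ElementTree as ET

def save_findings(path, findings):
    """Write (host, url, note) findings to a small XML report file."""
    root = ET.Element("webscan")
    for host, url, note in findings:
        entry = ET.SubElement(root, "finding", host=host, url=url)
        entry.text = note
    ET.ElementTree(root).write(path)

def load_findings(path):
    """Read the findings back, e.g. for a report-generating GUI."""
    return [(e.get("host"), e.get("url"), e.text)
            for e in ET.parse(path).getroot()]
```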

Maybe we can add options for using both databases and files in the
future. Having results in a database can be very useful if you want
to find relations between your tests... I mean, after testing a lot
of websites, you could start extracting statistical information about
your hosts. That information could be used to improve the tool, and
to learn (using machine learning, of course) how applications tend to
be organized. I'm pretty sure such magic would be much harder to
write if you don't have your results in a database. Anyway, this
feature is something for the very distant future (since we don't even
have the scanner yet!).
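(Editor's note: the statistics idea maps naturally onto SQL, and SQLite would avoid the server dependency mentioned above. A sketch with an invented schema, counting which paths turn up on the most hosts; the sample rows echo the output earlier in the thread:)

```python
import sqlite3

con = sqlite3.connect(":memory:")  # or a file, for results that persist
con.execute("CREATE TABLE finding (host TEXT, path TEXT, note TEXT)")
con.executemany("INSERT INTO finding VALUES (?, ?, ?)", [
    ("aurora.apache.org", "/icons/", "Icons directory"),
    ("aurora.apache.org", "/dev/", "Possible development directory"),
    ("scanme.nmap.org", "/icons/", "Icons directory"),
])

# Which paths show up on the most hosts?
stats = con.execute(
    "SELECT path, COUNT(DISTINCT host) AS hosts FROM finding "
    "GROUP BY path ORDER BY hosts DESC").fetchall()
```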

A separate tool, similar to how Zenmap/Ncat has been developed by Nmap
developers, might be better so results can be managed with a GUI and data
stored in a database, rather than trying to extend Nmap.

The idea of using NSE sounds good to me, since the NSE API already
has some functionality that could be very useful for developing the
scanner, and writing scripts is also a much easier task, which could
make the code easier to modify in the future. Anyway, I think the
most important thing is getting good performance and results... and
I'm not sure which option would be better.

I'm not sure about all the drawbacks, but maybe we can have a
collection of NSE scripts performing different tasks, and create a
GUI in another language (maybe Python?) that could be used to trigger
the scripts. This GUI would also read the results and place them in
the final report file. Since some tests depend on previous results,
the GUI could also handle reading and parsing the results to create
inputs for the next tests. I think that would be much easier to do in
Python than in NSE.
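(Editor's note: the "GUI triggers scripts, then feeds results into the next test" loop could lean on Nmap's -oX XML output. A Python sketch that pulls script results out of a trimmed-down fragment shaped like that output; real nmap XML has more attributes than shown here:)

```python
import xml.etree.ElementTree as ET

# Trimmed-down fragment in the shape of `nmap -oX -` output.
SAMPLE = """<nmaprun>
  <host><address addr="64.13.134.52" addrtype="ipv4"/>
    <ports><port protocol="tcp" portid="80">
      <state state="open"/>
      <script id="http-enum" output="/icons/ Icons directory"/>
    </port></ports>
  </host>
</nmaprun>"""

def script_results(xml_text):
    """Yield (address, port, script id, output) for every script result,
    ready to be turned into input for a follow-up test."""
    root = ET.fromstring(xml_text)
    for host in root.iter("host"):
        addr = host.find("address").get("addr")
        for port in host.iter("port"):
            for script in port.iter("script"):
                yield (addr, port.get("portid"),
                       script.get("id"), script.get("output"))
```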

I'm aware of the dependency that Python would add, and maybe it is
not the best option here. I could really use some suggestions on this
=].

Thanks for all the feedback and for the NSE code.

João

Rob

_______________________________________________
Sent through the nmap-dev mailing list
http://cgi.insecure.org/mailman/listinfo/nmap-dev
Archived at http://SecLists.Org

