Nmap Development mailing list archives

Re: [NSE] Script Dependencies Replacement for Runlevels


From: Ron <ron () skullsecurity net>
Date: Tue, 10 Nov 2009 23:24:37 -0600

Fyodor wrote:
On Tue, Nov 10, 2009 at 09:25:16AM -0600, Ron wrote:
I'm wondering if a library would be better for this sort of need than
"strong dependencies"?  Let's look at these two options in the case of
http-spider:

==As a spidering library:

 o Your script which depends on spidering can pass an argument saying
   what type of content you're interested in (e.g. do you care about
   images at all?  If not, no point in a script like sql-injection
   downloading them)

 o You can pass arguments specifying how deeply you want to spider
   (e.g. stop after 1,000 requests or 5 minutes, whichever comes first)

 o Depending on how the spidering library is architected, you may be
   able to control it interactively (e.g. have it call you back with each
   resource so you can deal with it one page at a time, and perhaps
   specify which embedded links you want to (or do not want to) spider
   recursively).  A rough sketch of such a callback-driven interface
   follows this list.

 o The library doesn't request anything if none of the scripts end up
   needing it.  And even when it is needed, it doesn't run until the
   point where it is actually used (because that is when it gets
   called).

 o The same script or other scripts can call it later if more
   information is needed.  For example, sql-injection may not care
   about images, but http-mirror might.  So http-mirror will call it
   with different arguments and the pages will come from our cache
   while the images will be fetched for the first time.

 o I'm assuming that the worker processes (Patrick's other neat
   pending patch) could be used by a library just as they can by a
   script.
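
To make the library option concrete, here is a rough Lua sketch of how a
script might drive such a library.  The "spider" module, its new/crawl
functions, the option names, and the fields of the page object are all
invented for illustration (only shortport.http is an existing NSE
helper); treat it as a placeholder, not a proposed interface.

  -- Hypothetical sketch only: the "spider" library and its names below
  -- are placeholders, not an existing NSE interface.
  local shortport = require "shortport"
  local spider    = require "spider"    -- hypothetical spidering library

  description = [[Sketch of a script driving a hypothetical spidering library.]]
  categories = {"discovery", "safe"}

  portrule = shortport.http

  action = function(host, port)
    -- Each script states only what it needs; the library fetches lazily
    -- and caches, so a later script (say http-mirror) asking for images
    -- reuses the pages already retrieved here.
    local crawler = spider.new(host, port, {
      include_images = false,   -- an sql-injection check has no use for images
      max_requests   = 1000,    -- stop after 1,000 requests...
      max_time       = 5 * 60,  -- ...or 5 minutes, whichever comes first
    })

    local found = {}
    -- Callback-style control: the library hands back one resource at a
    -- time and the script decides which embedded links are worth following.
    crawler:crawl(function(page)
      if page.status == 200 and page.body:match("<form") then
        found[#found + 1] = page.url
      end
      return page.links          -- the links to recurse into, if any
    end)

    if #found > 0 then
      return "Pages with forms:\n  " .. table.concat(found, "\n  ")
    end
  end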

==As a script:

 o I guess the script would just have to do something generic like
   fetch a bunch of pages (including images, I suppose), and then stop
   at some point and hope it has gathered whatever the scripts which
   depend on it need.  It can see user-specified NSE arguments, just
   as a library could, but doesn't know anything about the
   requirements of the scripts which depend on it.
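
Roughly, such a worker script might read whatever generic arguments the
user supplied and park its results in nmap.registry for the scripts that
depend on it.  nmap.registry and nmap.registry.args are real NSE
facilities; the argument name, registry key, and limit below are made up
for the sketch, and the link extraction is stubbed out.

  local http      = require "http"
  local nmap      = require "nmap"
  local shortport = require "shortport"

  description = [[Generic prefetching spider (sketch).]]
  categories = {"discovery", "safe"}

  portrule = shortport.http

  action = function(host, port)
    -- The only tuning available is whatever the user passed on the
    -- command line; the script cannot know what sql-injection or
    -- http-mirror actually need.
    local max_pages = tonumber(nmap.registry.args["spider.maxpages"]) or 50

    -- Shared cache for whichever scripts run after us.
    nmap.registry.spider = nmap.registry.spider or {}
    local cache = nmap.registry.spider

    -- Fetch "a bunch of pages" breadth-first from the root and hope that
    -- is enough for whatever depends on us.
    local queue, fetched = { "/" }, 0
    while queue[1] and fetched < max_pages do
      local path = table.remove(queue, 1)
      if not cache[path] then
        cache[path] = http.get(host, port, path)
        fetched = fetched + 1
        -- a real spider would parse cache[path].body here and append any
        -- discovered links to the queue
      end
    end

    return ("Prefetched %d page(s) into nmap.registry.spider"):format(fetched)
  end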

For these reasons, it seems to me that a spidering NSE library would
be better than a script, even if we had strong dependencies.

Cheers,
Fyodor

Although I agree that implementing strong dependencies as libraries is
possible, I think that 'strong dependencies' (or whatever you want to
call them) are a better solution in terms of code complexity. I can see
a function that downloads certain pages depending on its arguments, and
blocks while other scripts are using it, getting complicated fast.
Having 'worker' scripts that gather the information beforehand seems
conceptually nicer to me.

By your logic, it would make sense to eliminate all dependencies (weak
and strong) in favour of libraries. Is there a difference between a weak
and a strong dependency that makes libraries the better choice for
strong dependencies, while weak dependencies remain better served by
other scripts?
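
To make the weak/strong distinction concrete, here is a minimal fragment,
assuming the per-script dependencies table from the proposal this thread
discusses and using "http-spider" purely as a placeholder name:

  -- A weak dependency: ask that http-spider, if it was selected at all,
  -- run before this script.  Nothing here forces http-spider to run;
  -- that missing guarantee is what a "strong" dependency, or a library
  -- the script calls itself, would have to provide.
  dependencies = {"http-spider"}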

Also, having both weak and strong dependencies gives a nice sense of
symmetry to me. I like symmetry. :)

Ron

-- 
Ron Bowes
http://www.skullsecurity.org/
_______________________________________________
Sent through the nmap-dev mailing list
http://cgi.insecure.org/mailman/listinfo/nmap-dev
Archived at http://seclists.org/nmap-dev/

