Nmap Development mailing list archives

Re: scanning a set of websites to find /cgi-bin/nut/upsstats.cgi and /cgi-bin/apcupsd/multimon.cgi UPS information.


From: Daniel Miller <bonsaiviking () gmail com>
Date: Thu, 26 May 2016 09:02:08 -0500

There are several ways to accomplish this, depending on what your goals are.

1. If you want to detect the presence of these files in addition to many
others in order to fingerprint the web application, you can add
fingerprints to nselib/data/http-fingerprints.lua as described in the
comments in that file. Then you can use http-enum.nse to scan for them.
https://nmap.org/nsedoc/scripts/http-enum.html
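
For example, an entry along these lines (a rough sketch only; the comments
in http-fingerprints.lua are the authoritative reference for the exact
format) would get http-enum to probe for both paths:

  -- UPS status pages (NUT upsstats.cgi and apcupsd multimon.cgi)
  table.insert(fingerprints, {
      category = 'general',
      probes = {
        {path = '/cgi-bin/nut/upsstats.cgi', method = 'GET'},
        {path = '/cgi-bin/apcupsd/multimon.cgi', method = 'GET'}
      },
      matches = {
        {output = 'UPS status CGI'}
      }
    })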

2. If you want to detect the presence of one of these files only, you can
use the http-title.nse script and pass the http-title.url script argument
to specify the path to fetch.
https://nmap.org/nsedoc/scripts/http-title.html
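
For instance (untested; adjust the port list to wherever these web
interfaces actually listen):

  nmap -p 80 --script http-title \
    --script-args http-title.url=/cgi-bin/nut/upsstats.cgi <targets>

Since http-title.url takes a single path, checking both pages this way
means two runs; options 1 or 3 handle multiple paths in one pass.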

3. If you want to download any of these files if they exist, use the
http-fetch.nse script with the http-fetch.paths script argument. The pages
will be downloaded and saved to disk.
https://nmap.org/nsedoc/scripts/http-fetch.html
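
Something along these lines should work (untested; http-fetch.destination
is, as far as I remember, the argument naming the output directory, so
check the doc page above for the exact argument names):

  nmap -p 80 --script http-fetch \
    --script-args 'http-fetch.destination=/tmp/ups-pages,http-fetch.paths={/cgi-bin/nut/upsstats.cgi,/cgi-bin/apcupsd/multimon.cgi}' \
    <targets>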

There are other options, too, if you peruse the documentation for our many
http-* scripts.

Dan

On Wed, May 25, 2016 at 5:34 PM, Abhinav Bajaj <abhinav_bajaj2009 () yahoo com>
wrote:

Hello,

Can you please tell me what changes I have to make in order to get nmap to
scan for the upsstats.cgi or multimon.cgi pages under cgi-bin only? I think
I'm supposed to change the http-fingerprints.lua file under nselib/data.
Please help me out; I have been struggling with this for a long time. Your
prompt help is appreciated.

Abhinav


_______________________________________________
Sent through the dev mailing list
https://nmap.org/mailman/listinfo/dev
Archived at http://seclists.org/nmap-dev/
