Nmap Development mailing list archives

Re: [NSE script] vhosts on the same ip


From: "sara fink" <sara.fink () gmail com>
Date: Mon, 25 Aug 2008 17:13:10 +0300

I tried to use it, but I get wrong results. In which directory should
nse_sedusa be?
This is what I get:


nmap -sP --script vhosts insecure.org

Starting Nmap 4.68 ( http://nmap.org ) at 2008-08-25 17:05 IDT
Host insecure.org (64.13.134.49) appears to be up.
Nmap done: 1 IP address (1 host up) scanned in 1.075 seconds





On Mon, Aug 25, 2008 at 2:31 PM, Sven Klemm <sven () c3d2 de> wrote:

Hi,

I've written an NSE script that queries search.live.com for host names
using the same IP. The script requires the changes in my nse_sedusa branch
(svn://svn.insecure.org/nmap-exp/sven/nse_sedusa).

I don't like the fact that it uses an external search engine to get this
information, but I think the usefulness of the information outweighs this.
I am open to better ideas for implementing this, or to further sources to
get lists of vhosts from.

Example output:


./nmap -sP --script vhosts insecure.org

Starting Nmap 4.68 ( http://nmap.org ) at 2008-08-25 13:27 CEST
Host insecure.org (64.13.134.49) appears to be up.

Host script results:
|_ vhosts: cgi.insecure.org, download.insecure.org, images.insecure.org,
insecure.com, insecure.org, www.insecure.com, www.insecure.org

Nmap done: 1 IP address (1 host up) scanned in 0.81 seconds


Cheers,
Sven


--
Sven Klemm
http://cthulhu.c3d2.de/~sven/


---
-- Tries to find vhosts by querying search.live.com for other hosts on the same IP
--
--@output
-- |_ vhosts: cgi.insecure.org, download.insecure.org, images.insecure.org,
-- insecure.com, insecure.org, www.insecure.com, www.insecure.org

require "sedusa"
require "ipOps"

id = "vhosts"
description = "Tries to find vhosts by querying search.live.com for other hosts on the same IP"
author = "Sven Klemm <sven () c3d2 de>"
license = "Same as Nmap--See http://nmap.org/book/man-legal.html"
categories = {"intrusive","discovery"}


hostrule = function( host )
 return not ipOps.isPrivate( host.ip ) and ipOps.get_parts_as_number( host.ip ) ~= 127
end

--- extract host names from search result page
--@return table with names of the hosts
local extract_hosts = function( document )
 local _,results,vhosts,host
 vhosts = {}

 results = document.xml:find_all('//div[@id="results"]/ul[@class="sb_results"]/li/ul[@class="sb_meta"]/li/cite')
 for _,host in pairs( results ) do
   host = host:gsub("https?://",""):gsub("/.*","")
   table.insert( vhosts, host )
 end

 return vhosts
end
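
The two gsub calls above strip the URL scheme and then everything from the
first slash, leaving a bare hostname. A quick standalone check of that chain
(plain Lua, no Nmap required; the sample URLs are made up for illustration):

```lua
-- Normalize search-result URLs to bare hostnames, as extract_hosts does.
local samples = {
  "http://www.insecure.org/nmap/",
  "https://cgi.insecure.org/some/path",
  "insecure.org/",
}
for _, s in ipairs( samples ) do
  local host = s:gsub("https?://",""):gsub("/.*","")
  print( host )
end
```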

--- add table of hosts to vhosts table ignoring duplicates
--@param vhosts vhosts table
--@param hosts table of hosts to be added to vhosts
local add_hosts = function( vhosts, hosts )
 local _,host
 for _,host in pairs( hosts ) do
   if not vhosts[host] then
     vhosts[host] = host
     table.insert( vhosts, host )
   end
 end
end
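
Note that add_hosts uses the vhosts table in two ways at once: as a set
(indexed by hostname, for the duplicate check) and as an array (via
table.insert, so table.sort and table.concat work later). A standalone
sketch of that trick with made-up hostnames:

```lua
-- The same table serves as set (vhosts[host]) and array (vhosts[1..n]).
local function add_hosts( vhosts, hosts )
  for _, host in pairs( hosts ) do
    if not vhosts[host] then
      vhosts[host] = host
      table.insert( vhosts, host )
    end
  end
end

local vhosts = {}
add_hosts( vhosts, { "a.example", "b.example" } )
add_hosts( vhosts, { "b.example", "c.example" } )
print( #vhosts )  -- the length operator only counts the array part
```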

action = function(host, port)
 local _,doc,vhosts,pages
 vhosts = {}

 doc = sedusa.http_get( 'http://search.live.com/results.aspx?go=&q=ip%3A' .. host.ip )

 -- the result section is not empty
 if doc.xml:find('//div[@id="results"]/ul[@class="sb_results"]') then

   add_hosts( vhosts, extract_hosts( doc ))

   -- look whether there are more result pages
   pages = doc.xml:find_all('//div[@id="results_area"]/div[@class="sb_pag"]/ul/li/a[not(@class="sb_pagN")]/@href')

   -- fetch further result pages (at most three)
   for counter,url in ipairs( pages ) do
     if counter > 3 then break end
     doc = sedusa.http_get( 'http://search.live.com' .. url )
     add_hosts( vhosts, extract_hosts( doc ))
   end

 end

 if #vhosts > 0 then
   table.sort( vhosts )
   return table.concat( vhosts, ', ')
 end
end



_______________________________________________
Sent through the nmap-dev mailing list
http://cgi.insecure.org/mailman/listinfo/nmap-dev
Archived at http://SecLists.Org

