Nmap Development mailing list archives

Re: DNS cache snooping script


From: Eugene Alexeev <eugene.alexeev () gmail com>
Date: Sat, 15 May 2010 07:42:03 -0600

David,

I agree with you.  I'm also thinking of including the option of reading the
site list over HTTP.  It would be limited to consuming one line at a time,
but would let the user leverage sites like the zeus tracker.  How do you
want to go about creating the site list to be distributed with the script?
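Something along these lines is what I'm picturing for the HTTP source
(untested sketch; the helper name is made up, but http.get and
stdnse.strsplit are the NSE library calls I'd reach for):

    local http = require "http"
    local stdnse = require "stdnse"

    -- Fetch a newline-separated list of names to snoop for over HTTP,
    -- e.g. from a zeus tracker feed. Returns a table of names, or nil
    -- if the request fails.
    local function fetch_site_list(host, port, path)
      local response = http.get(host, port, path)
      if not response or response.status ~= 200 then
        return nil
      end
      local names = {}
      for _, line in ipairs(stdnse.strsplit("\r?\n", response.body)) do
        line = line:gsub("^%s+", ""):gsub("%s+$", "")  -- trim whitespace
        if #line > 0 then
          names[#names + 1] = line
        end
      end
      return names
    end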

Eugene


On Fri, May 14, 2010 at 10:08 PM, David Fifield <david () bamsoftware com> wrote:

On Mon, Apr 12, 2010 at 10:50:29AM -0600, Eugene Alexeev wrote:
Ron,

Thanks!  I'm glad you like the script.  I based the default list on the
thought: "If I were doing a pen test against someone, what would I like to
know?".  The answers I came up with were:
1.  If I have to do social engineering, where can I find the employees?
2.  What operating systems are present in the environment, even if they are
not visible during the nmap scan?
3.  How do they patch, and how frequently?
4.  How would I detect that the DNS server is lying to me?

So based on that, I stuck in:  common social sites, email login pages,
operating system pages, some update pages, and some obscure pages (like
kernel.org and insecure.org).  I've also built in the ability for the user
to supply their own queries via --script-args
snoop_hosts={host0\,host1\,hostN}.
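A rough sketch of the argument handling (the default table here is only a
placeholder and the helper name is made up; nmap.registry.args is where
script-args end up):

    local nmap = require "nmap"
    local stdnse = require "stdnse"

    -- Placeholder defaults; the real script ships a longer curated list.
    local default_hosts = { "mail.google.com", "www.facebook.com",
                            "update.microsoft.com" }

    -- Return the user-supplied snoop_hosts list, or the defaults.
    local function get_snoop_hosts()
      local arg = nmap.registry.args.snoop_hosts
      if not arg then
        return default_hosts
      end
      if type(arg) == "table" then
        -- {host0,host1,...} arrives already parsed into a table
        return arg
      end
      -- otherwise it's a single comma-separated string
      return stdnse.strsplit(",%s*", arg)
    end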

I tried this against my ISP's DNS server:

53/udp open  domain
| dns-cache-snoop: DNS cache contains:
| -->   mail.google.com
| -->   hotmail.com
| -->   login.live.com
| -->   www.facebook.com
| -->   facebook.com
| -->   twitter.com
| -->   www.linkedin.com
| -->   www.myspace.com
| -->   myspace.com
| -->   www.flickr.com
| -->   www.youtube.com
| -->   digg.com
| -->   update.microsoft.com
| -->   www.microsoft.com
| -->   www.openbsd.org
| -->   www.sun.com
|_-->   nmap.org

I then ran with snoop_mode=timed, getting the same results, then ran
once more and saw that the cache was polluted and all the names were cached,
so that all works as expected.
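
For reference, the per-name check can be as simple as a non-recursive query,
something like the sketch below (not necessarily how the script actually does
it, and the dns.query option names here are assumptions on my part):

    local dns = require "dns"

    -- A name already in the server's cache answers a non-recursive query;
    -- an uncached name does not, and nothing new gets pulled into the cache.
    local function is_cached(host, port, name)
      local status = dns.query(name, {
        host = host.ip,
        port = port.number,
        dtype = "A",
        norecurse = true,
      })
      return status
    end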

I'm just about ready to commit this script. I just think we need to put
some more thought into the domain list. This made me think of the CSS
history-sniffing exploits, where a web server can tell what other sites
you've visited.
The BeEF tool (http://www.bindshell.net/tools/beef/) has a mode that
does this. The file modules/network/detect_visited_urls/index.php in
their distribution has a list beginning

yahoo.com
google.com
youtube.com
live.com
msn.com
myspace.com
wikipedia.org

The alexa.txt file in the same directory identifies it as "Top 500 sites
from Alexa (2006-04-21)." Some other text files in the directory and
their top few sites:

sites.txt:
adwords.google.com
blogger.com
care.com
careerbuilder.com
ecademy.com
facebook.com
gather.com
gmail.com

social.txt:
www.twitter.com
twitter.com
www.myspace.com
myspace.com
www.facebook.com
facebook.com
www.slashdot.org
slashdot.org

I don't know that there's much value in looking for the top sites--it's
no surprise to find one of them in a DNS cache. On the other hand, we
have to have some kind of default list. I think your reasoning is sound,
that we want to include some operating system–specific sites and update
pages. So I suggest we take the Alexa top 100 or so, and combine it with
curated sites that Nmap users suggest.

The script argument should allow reading the site list from a file
instead of listing it in full on the command line.
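Reading the file could be as minimal as this (a sketch; snoop_hosts_file is
a hypothetical argument name, not something the script defines):

    -- Read one name per line from a user-supplied file, skipping blank
    -- lines and '#' comments. Returns a table of names, or nil and an
    -- error message if the file can't be opened.
    local function read_site_list(filename)
      local f, err = io.open(filename, "r")
      if not f then
        return nil, err
      end
      local names = {}
      for line in f:lines() do
        line = line:gsub("^%s+", ""):gsub("%s+$", "")
        if #line > 0 and not line:match("^#") then
          names[#names + 1] = line
        end
      end
      f:close()
      return names
    end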

David Fifield

_______________________________________________
Sent through the nmap-dev mailing list
http://cgi.insecure.org/mailman/listinfo/nmap-dev
Archived at http://seclists.org/nmap-dev/

