Nmap Development mailing list archives

Re: [Bug]? -iR <num_hosts> on windows XP generates duplicate targets


From: Fyodor <fyodor () insecure org>
Date: Wed, 23 Apr 2008 20:56:13 -0700

On Thu, Apr 24, 2008 at 03:38:16AM +0000, Brandon Enright wrote:

> Hmm, I don't think we're out of the woods yet.  I think we're hitting
> a very short cycle problem of Visual Studio's rand() LCG that won't
> show up using GCC on Linux.
>
> We may need to use rand_s() on Windows.
>
> I just sent a follow-up note to your previous one.  I don't have a
> Windows dev box or I'd help test this.
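
For reference, here is a rough sketch of what the rand_s() approach
could look like (illustrative only; the helper name below is made up
and is not nmap's actual get_random_bytes()):

  /* Sketch only: fill a buffer with rand_s() output on Windows.
   * _CRT_RAND_S must be defined before <stdlib.h> to get the prototype. */
  #define _CRT_RAND_S
  #include <stdlib.h>
  #include <string.h>

  static void fill_random_bytes(void *buf, size_t numbytes) {
    unsigned char *p = (unsigned char *) buf;
    size_t i = 0;

    while (i < numbytes) {
      unsigned int r;
      size_t chunk;

      if (rand_s(&r) != 0)
        abort(); /* rand_s() failed; no randomness available */

      /* Copy up to 4 fully random bytes per call into the buffer. */
      chunk = (numbytes - i < sizeof(r)) ? (numbytes - i) : sizeof(r);
      memcpy(p + i, &r, chunk);
      i += chunk;
    }
  }

rand_s() draws from the OS random source rather than the CRT's LCG, so
it would sidestep the cycle problem entirely, though it is slower than
rand().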

Well, I tested the patch I applied on Windows and I get:

$ ./nmap -n -sL -iR 500000 | egrep '^Host' | sort | uniq | wc
 499968 1999872 15684637

So 499,968 uniques out of 500,000.  That's pretty reasonable.  But if
anyone can find a problem with the patch or a better solution, I'm all
ears.

BTW, I compiled with Visual Studio on Windows XP SP2 and am running
the test from Cygwin.

It may be OK that Windows' RAND_MAX is only 32767 (15 bits), because
we only use 16 bits per call anyway:

  /* Fill bytebuf one 16-bit short at a time with rand() output. */
  for (i = 0; i < sizeof(bytebuf) / sizeof(short); i++) {
    iptr = (short *) ((char *) bytebuf + i * sizeof(short));
    *iptr = rand();
  }

Maybe we should only be doing one byte at a time, since the high bit
of every second byte we generate may always be zero on Windows.  Anyone
want to test this and make a patch?  The patch could check RAND_MAX
and use that to decide the number of bytes to use per call.
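
Something along these lines, perhaps (untested sketch; the helper name
and the compile-time check are just one way to do it):

  /* Sketch: use only as many low-order bytes of each rand() call as
   * RAND_MAX guarantees are fully random (1 byte for a 15-bit rand()
   * such as Windows', 3 bytes for a 31-bit rand() such as glibc's). */
  #include <stdlib.h>

  #if RAND_MAX >= 0x7FFFFFFF
  #define RAND_BYTES_PER_CALL 3
  #else
  #define RAND_BYTES_PER_CALL 1
  #endif

  static void fill_bytes_with_rand(unsigned char *buf, size_t numbytes) {
    size_t i = 0;

    while (i < numbytes) {
      int r = rand();
      int b;

      for (b = 0; b < RAND_BYTES_PER_CALL && i < numbytes; b++) {
        buf[i++] = (unsigned char) (r & 0xFF);
        r >>= 8;
      }
    }
  }

That way Windows only ever uses the low 8 fully random bits of each
call, at the cost of more rand() calls per buffer.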

Cheers,
-F

_______________________________________________
Sent through the nmap-dev mailing list
http://cgi.insecure.org/mailman/listinfo/nmap-dev
Archived at http://SecLists.Org

