Nmap Development mailing list archives

Re: [Zenmap] Slow "Open" for large files


From: Daniel Miller <bonsaiviking () gmail com>
Date: Sun, 8 Jun 2014 11:30:58 -0500

Jay,

Thanks for following up on this! This patch is exactly what I was looking
for. It passes my checks; please commit it at your earliest convenience.

Dan


On Sun, Jun 8, 2014 at 7:58 AM, Jay Bosamiya <jaybosamiya () gmail com> wrote:

Hi All!

While I was testing things for my other patch (the MemoryError one), I
came upon something rather strange. A scan took about 30 seconds to run
and roughly a second to save, but hours to load. The "Open" option was
far too slow for large files!

Some analysis showed that the Open option was building the output with
repeated string concatenation, which made the loading algorithm
quadratic in time. (Thanks, Dan, for pointing out that repeated string
concatenation is slow in Python.)

Switching to StringIO immediately improves the speed: the same file now
opens in under half a minute!
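A minimal sketch (not the actual Zenmap patch) of why the change helps:
repeated += on a string can copy the whole buffer on each append, while
StringIO appends cheaply and builds the final string once at the end.

```python
import io

def build_by_concat(chunks):
    # Each += may copy the entire accumulated string, so total work
    # can grow quadratically with the size of the output.
    out = ""
    for chunk in chunks:
        out += chunk
    return out

def build_by_stringio(chunks):
    # io.StringIO appends to an internal buffer, keeping the total
    # work roughly linear in the size of the output.
    buf = io.StringIO()
    for chunk in chunks:
        buf.write(chunk)
    return buf.getvalue()

# Both produce identical results; only the time complexity differs.
chunks = ["x" * 64] * 10000
assert build_by_concat(chunks) == build_by_stringio(chunks)
```

(Modern CPython sometimes optimizes += on strings in place, so a timing
demo may not always show the quadratic blowup, but the StringIO approach
is the portable, reliably linear one.)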

Attached is the patch. I also had to refactor some other parts of
zenmap/zenmapCore/NmapParser.py, since _nmap_output was used in some
places where nmap_output should be used.

I have tested this patch and found no problems.

Feedback is welcome.

Cheers,
Jay

PS: The command I used to generate the large file is "nmap
--packet-trace -r --top-ports 1000 127.0.0.1/24". The generated xml file
(using "Save" from Zenmap) is about 20 MB in size!

_______________________________________________
Sent through the dev mailing list
http://nmap.org/mailman/listinfo/dev
Archived at http://seclists.org/nmap-dev/
