Penetration Testing mailing list archives
Re: Web Application Scanners Comparison
From: Andre Gironda <andreg () gmail com>
Date: Tue, 27 Jan 2009 21:29:10 -0700
On Tue, Jan 27, 2009 at 12:32 PM, anantasec <anantasec () googlemail com> wrote:
> - What policies did you use for the tools? Did you create them?
> - Any specific tuning?
>
> I used the default policies for all the tools. I didn't do any tuning for any of them. Most users of these tools just use the default settings; they don't have enough knowledge to configure the tools.
The vendors recommend configuration/tuning and it seems pretty appropriate given the nature of the tools.
> - What about the application coverage (not only links)? Maybe a tool didn't find a vulnerability because it didn't cover this part of the application. Should it then get -5, since it's a crawler problem?
>
> Yes, it should get a -5 if it didn't find a valid vulnerability. I don't think it's important why it didn't find a vulnerability.
Most people do care.
> If a tool doesn't cover a part of the application and generates a false negative, I don't think it should count as much as if it covers the application and still generates a false negative: since you focus on rating vulnerability finding, you have no idea what you are scoring here -- the badness of the crawler/parser or the badness of the attack engine.
I'm going to have to agree with Romain, especially on this point.

Look, the basic premise is that web application security scanners work differently in different hands. If you know the difference between a breadth-first and a depth-first search, and know the other tunables, you get a totally different result. There are still no good comparatives for web application security scanners.

Web application security scanners are relatively useless in non-expert hands. A seriously old-school person with 5+ years of experience is required to run these tools to get any value beyond awareness. The purpose of running such a tool should be to get root cause. That works best when source-code assisted with an advanced tool such as Dinis Cruz's O2, then noting where O2 missed certain software weaknesses in order to home in on those specific areas with a functional fault-injection tool such as a web application security scanner -- and possibly a few semi-manual methods using tools like Burp Suite, flasm/flare/swfintruder, Firebug/Firecookie, and/or Sahi, along with passive tools such as ProxMon, Pantera, ratproxy, Casaba Passive Web Security Auditor, and Skavenger.

A lot of this interaction is really application-specific -- for example, whether Flash, Ajax, or other RIA/widget technologies are in use -- in addition to framework/language-specific.

The industry has decided that neither VAPT, WAF, nor SAST is usable alone as a path to application security. Put the right tools in the right human hands; centralize, self-service, and un-silo your respective in-house expertise. Anything else is uncivilized.

Cheers,
Andre
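[Editor's note: the breadth- vs. depth-first tunable mentioned above is easy to see in a toy crawler sketch. This is hypothetical illustration code, not taken from any of the scanners discussed; `get_links` stands in for fetching and parsing a page. Under a page budget, the same frontier yields different coverage depending only on which end it pops from -- which is exactly why an untuned scanner can miss whole areas of an application.]

```python
from collections import deque

def crawl(start, get_links, strategy="bfs", max_pages=100):
    """Toy crawler: identical logic, but the visit order (and thus
    which pages fit inside max_pages) depends on the pop strategy."""
    frontier = deque([start])
    visited = []
    while frontier and len(visited) < max_pages:
        # BFS pops the oldest URL; DFS pops the newest one.
        url = frontier.popleft() if strategy == "bfs" else frontier.pop()
        if url in visited:
            continue
        visited.append(url)
        frontier.extend(get_links(url))
    return visited

# A tiny site map standing in for a real application.
site = {
    "/": ["/a", "/b"],
    "/a": ["/a/1"],
    "/b": [],
    "/a/1": [],
}
links = lambda u: site.get(u, [])

print(crawl("/", links, "bfs"))  # ['/', '/a', '/b', '/a/1']
print(crawl("/", links, "dfs"))  # ['/', '/b', '/a', '/a/1']
```

With a tight page budget (say `max_pages=3`), the two strategies cover different subsets of the site, so a vulnerability in an unvisited page is a crawler miss, not an attack-engine miss.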