WebApp Sec mailing list archives

Re: OS XSS and SQL scanner


From: "Dean H. Saxe" <dean () fullfrontalnerdity com>
Date: Wed, 2 Aug 2006 10:14:16 -0400

Rory,

Great point. Good code reviewers/threat modelers are in short supply. However, I believe that code reviews -- which take more time and cost more money initially -- can be done much earlier in the lifecycle on a regular basis. This leads to finding and fixing those vulnerabilities earlier in the lifecycle at less cost (see Capers Jones, 1996, for a breakdown of how much it costs to fix a bug at any point in the SDLC).

I also believe that this can be addressed through developer training and changes to the SDLC which introduce security into the lifecycle (CLASP, McGraw's touchpoints model, etc.).

This is a time-consuming process, but having been in a situation where I had to fix flaws in almost 1 million LOC while the code base was still changing, I can tell you that without such changes this type of approach will never be successful. As I fixed 100 instances of XYZ vulnerability, 50 new instances were created elsewhere. I was chasing my tail and only seeing small improvements.

I disagree with you on whether they can and will effectively test every field for SQL Injection, XSS, etc. I haven't seen any evidence that they can do so with a reasonable rate of false negatives. I'd love to have a tool vendor prove me wrong on an application that they have never seen before.

-dhs

P.S.  I guess I'll have to come to the UK for a pint. ;-)

Dean H. Saxe, CISSP, CEH
dean () fullfrontalnerdity com
"What difference does it make to the dead, the orphans, and the homeless, whether the mad destruction is wrought under the name of totalitarianism or the holy name of liberty and democracy?"
    --Gandhi


On Aug 2, 2006, at 4:58 AM, Rory McCune wrote:

My 2p (UK) on this.

I'd agree with you in saying that the best results come from code review/threat model or manual pen test, but you hit the nail on the head with the phrase "talented testers/reviewers," who, in my experience, are unfortunately in limited supply. The cost implications can also rule out that type of testing for many applications, so a lot of the time you need to make the best of the limited time you've got.

In my opinion web application scanners function best as an adjunct to manual testing, not as a replacement for it. My experience is that they are useful for running very large numbers of tests to get coverage for things like SQL injection and XSS on every field in an application, but they don't reliably find things like logic errors and authorization problems. That isn't too surprising, as those issues tend to be application-specific and therefore extremely hard to write a generic test for.
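[Editor's note: the brute-force coverage described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual test suite; the payload lists, error signatures, and the `submit` callback are all assumptions made for the example.]

```python
# Minimal sketch of what a scanner's field-by-field coverage looks like:
# inject canned payloads into every parameter and look for tell-tale
# responses. Payloads and error strings below are illustrative only.

SQLI_PAYLOADS = ["'", "' OR '1'='1"]
XSS_PAYLOADS = ["<script>alert(1)</script>"]
SQL_ERRORS = ["syntax error", "unclosed quotation mark", "ora-01756"]

def scan_fields(fields, submit):
    """Fuzz each field in turn; `submit` maps a field dict to a response body."""
    findings = []
    for name in fields:
        for payload in SQLI_PAYLOADS:
            body = submit({**fields, name: payload}).lower()
            if any(err in body for err in SQL_ERRORS):
                findings.append((name, "sqli", payload))
                break
        for payload in XSS_PAYLOADS:
            body = submit({**fields, name: payload})
            if payload in body:  # payload reflected back unencoded
                findings.append((name, "xss", payload))
                break
    return findings

# Toy vulnerable handler standing in for a real application:
def fake_app(params):
    if "'" in params.get("user", ""):
        return "DB error: unclosed quotation mark"
    return "Hello " + params.get("comment", "")

print(scan_fields({"user": "a", "comment": "b"}, fake_app))
# -> [('user', 'sqli', "'"), ('comment', 'xss', '<script>alert(1)</script>')]
```

Note what the sketch cannot do: a logic flaw such as "user A can view user B's invoice" produces no error string and no reflected payload, which is exactly why generic injection coverage doesn't extend to authorization problems.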

The other point I'd make is that the current generation of web application scanners is best used by experienced web app testers, as opposed to developers or non-specialist security types, because the tools need a fair amount of tweaking to get the best out of them (although it's worth noting that when I reviewed the apps earlier in the year, AppScan seemed to be moving in that direction; it will be interesting to see how they go).

Unfortunately I'm not at BlackHat :O(

cheers

Rory

On 8/2/06, Dean H. Saxe <dean () fullfrontalnerdity com> wrote:

Hear, hear, Arian.

Let's see the web app scanner folks go up against a manual pen test
and code review/threat model on a series of apps.  One caveat: the
results must be open for review, which means publishing the results
in an open forum for all to see.

FWIW, I'm a former customer of SPI Dynamics. I have experience with web app scanners in an enterprise environment, along with pen testing and code reviews. I have a good idea how things will shake out: web app scanners are inexpensive to run but don't find significant numbers of vulnerabilities. Pen tests are a decent measure of security at a reasonable cost when performed by talented testers. Code review and threat modeling find the most vulnerabilities at the highest cost when performed by talented reviewers.

Will any web app scanner companies actually subject their scanners to
such a bake off?  If not, how can we trust the marketing material?
Was Gary McGraw right in calling these tools "badnessometers"?

I'm at BlackHat all week.  Email me and we'll get together and chat.
I'll be attending the WASC gathering at Shadow Bar tomorrow night.  I
hope to see some of you there.

-dhs

Dean H. Saxe, CISSP,  CEH
dean () fullfrontalnerdity com
"[T]he people can always be brought to the bidding of the leaders.
This is easy. All you have to do is to tell them they are being
attacked, and denounce the pacifists for lack of patriotism and
exposing the country to danger. It works the same in every country."
--Hermann Goering, Hitler's Reich-Marshall at the Nuremberg Trials


On Aug 1, 2006, at 2:35 PM, Arian J. Evans wrote:

>
>
>> -----Original Message-----
>> From: Mandeep Khera [mailto:mandeep () cenzic com]
>>
>> I am sorry to hear that you perceive some problems with our
>> product. We take pride in being the most accurate product
>> with least amount of false positives in the industry. This
>> has been proven in many bake-offs by customers and
>> independent journalists.
>
> Hate to take this a little off topic, but do you have any facts
> that can support or back up these claims? Any data produced by
> anyone competent that speaks to your "false positives" and also
> your "false negatives"?
>
> I have failed to read a review yet to date that contains useful
> information. So far what I've read varies from useless data
> organized around features like "reflective buttons" (e.g., the
> Acunetix review posted to this list written by some woman
> who writes windows software articles) to the other extreme
> of uninformed opinion and inability to keep features between
> the products straight (secure enterprise computing review).
> This includes infosec magazine and online reviews, bake-offs,
> and Gartner-style evals. Every one I have read so far is garbage.
>
> Not one covers the actual tests run, and the how and why around them.
>
> This situation is no doubt due to the utter lack of skill
> and understanding of the subject on the part of the authors.
>
> However, I think all on this list would welcome information
> of a high-quality nature regarding scanner quality, if you
> have anything like that to point us at.
>
> -ae
>
>
>
>
>
> -------------------------------------------------------------------------
> Sponsored by: Watchfire
>
> Do you test web applications for XSS, SQL Injections, Buffer Overflows,
> Logical issues and other web application security threats? Why not
> automate this work with Watchfire's AppScan, the world's leading
> automated web application scanner. Download AppScan today!
>
> https://www.watchfire.com/securearea/appscancamp.aspx?id=701300000008BP9
> -------------------------------------------------------------------------
>



