WebApp Sec mailing list archives

On Application Scanners (Was: Application Assessment)


From: "Mark Curphey" <mark () curphey com>
Date: Sun, 14 Aug 2005 09:41:35 -0400

An interesting thread in some ways. Reviews from the likes of Enterprise
Computing / eWeek are likely to be of little value (to me anyway), and you
should never evaluate security software by comparing features a vendor
claims to have on paper ;-( I have seen reviews of products (not scanners
specifically) get 5 stars for effectiveness in the press while I was sitting
on vendor-funded reports showing how totally ineffective they are. The only
test results that are worthwhile are ones that are:

-in the open
-reproducible and repeatable
-conducted by competent people with no bias (i.e. don't sell advertising)

Dinis Cruz and I have been shooting around a spec for a tool that people can
use to test the effectiveness of these types of tools on their own.
Basically it will generate a web site based on size (number of pages and
links), the number, type and complexity of vulns, and the amount of
application complexity (number of forms, JavaScript, quirks, etc.) so you
can mimic a typical environment: yours! Canned website evals are obviously
of no value whatsoever, and personally I find them quite patronizing to the
intelligence of the potential buyer. If I were in the market for a tool, or
thinking about renewing my license, I would definitely wait until we release
this. If all goes well we plan to release the tool in time for the OWASP
Conference in DC in October. It will be free and probably open source.
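To make the idea concrete, here is a minimal sketch of the kind of generator described above: given a target size and a vulnerability mix, it emits a site "spec" that a harness could render into real pages. The function name, spec shape and vuln type labels are my own illustrative assumptions, not the actual tool; the seeded RNG is there because the whole point of such a benchmark is that runs must be reproducible and repeatable.

```python
import random

# Hypothetical vulnerability classes a generated benchmark site might plant.
VULN_TYPES = ["xss_reflected", "sql_injection", "path_traversal", "csrf"]

def generate_site_spec(num_pages, vuln_counts, forms_per_page=2, seed=0):
    """Return a list of page specs with vulns spread across random pages.

    A fixed seed makes the generated site deterministic, so independent
    testers can reproduce exactly the same benchmark.
    """
    rng = random.Random(seed)
    pages = [{"id": i, "forms": forms_per_page, "vulns": []}
             for i in range(num_pages)]
    for vuln_type, count in vuln_counts.items():
        for _ in range(count):
            # Plant each instance of this vuln class on a random page.
            rng.choice(pages)["vulns"].append(vuln_type)
    return pages

# A 50-page site seeded with 5 reflected XSS and 3 SQL injection issues.
spec = generate_site_spec(50, {"xss_reflected": 5, "sql_injection": 3})
total_vulns = sum(len(p["vulns"]) for p in spec)
```

Because the spec records exactly where every issue lives, a scanner run against the rendered site can be scored objectively rather than by vendor claims.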

Here is a blog posting I made a while back on the topic of web application
scanners; the link to the original is below, followed by the full text.

https://www.threatsandcountermeasures.com/blogs/marksblog/archive/2005/05/19/382.aspx

____________________________________________________________________________

Following the lead of Dan Geer, who I respect enormously for speaking his
mind about things, here are a few bullet points to consider about web app
scanners in general:

What We Know About Testing 
Would You Drive a Car If It Was Front Impact Tested Only? 
Too Little Too Late 
Economics of Testing After Code Is Built
Signatures and Turing Machines 
Bugs vs. Flaws 
Developer Tools, Come On! 
Performance Sucks, But How Do You Prove It?


----------------------------------------------------------------------------


What We Know About Testing - surprisingly little work has been published
about security testing; that is to say, little work of real interest.

Would You Drive a Car If It Was Front Impact Tested Only? - I doubt it!
Denis Verdon (CSO of a major financial services company) once presented a
very humorous analogy called "if cars were built like applications...". He
poked fun at the security community, asking what would happen if you got a
side impact, or were shunted from the rear (the database), or how your car
would react when you weld two cars together. This is what penetration
testing is: front impact testing only.

Too Little, Too Late - At Foundstone we have had many clients ask us to test
after they have built the code, only to find they made major architectural
errors. With deadlines and economics it is often too late to fix things.
Testing after the application is built is too little, too late.

The Economics of Testing After Code Is Built - if you look at the
development community, there have been many studies on how effective certain
techniques are and the cost implications of finding bugs at certain stages
in the SDLC. One great study was by Capers Jones in Applied Software
Magazine in 1996 (yes, 1996). He concluded that a bug found during
development cost an average of $25 to fix, while a bug found post-release
cost $16,000. No rocket science there; the IBM Systems Sciences Institute
found similar results. So why do people insist on testing after the code is
built? Web application scanners are after the event: they test over HTTP,
which means the code must compile and run. This is too late! Interestingly
enough, Capers Jones also concluded that 85% of bugs are introduced during
development time.
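A quick back-of-the-envelope calculation using the figures quoted above ($25 per bug caught in development, $16,000 per bug caught post-release) shows why catch-it-early matters; the 85%/100-bug scenario below is illustrative, not from the study.

```python
# Cost figures quoted above from the 1996 Capers Jones study.
COST_DEV, COST_POST = 25, 16_000

def total_fix_cost(num_bugs, caught_in_dev_fraction):
    """Total fix cost when a given fraction of bugs is caught in development."""
    caught = round(num_bugs * caught_in_dev_fraction)
    return caught * COST_DEV + (num_bugs - caught) * COST_POST

# 100 bugs: catching 85% early vs. letting everything slip to post-release.
early = total_fix_cost(100, 0.85)  # 85 x $25 + 15 x $16,000
late = total_fix_cost(100, 0.0)    # 100 x $16,000
```

Even with 85% caught early, the 15 escapees dominate the bill; a tool that only runs after the code compiles and deploys can never move bugs into the cheap column.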

Signatures and Turing Machines - the security community has long had a
fascination with signatures. Intrusion Detection Systems and network
scanners use them to find vulnerabilities and hackers. But the paradigm just
doesn't translate to bespoke applications (and I am sparing the reader a
discussion of the indignant marketing BS produced by many vendors about
learning systems and generic signatures). All applications are different.
When you write a signature for an OS check you can predict with a good
degree of certainty where the issue will manifest itself and what you should
do to find it. With bespoke applications you cannot. Think about a web
application and a single string input: the space of possible inputs is vast,
unfeasible to guess at. It's like finding a digital needle in a haystack.

Bugs vs. Flaws - in my last post I talked about bugs vs. flaws.
Architectural flaws, like a developer using a bad crypto algorithm, are
almost impossible to find in an automated fashion. If we accept these issues
are hard for humans to find, how can we even think a crude automated tool is
going to stand a chance?

Developer Tools, Come On! - I am not sure who these marketing people are
trying to kid, but just because something plugs into an IDE doesn't make it
a developer tool. Developers write code; they don't play with HTTP streams.
Too little, too late. Maybe I can write a Visio plug-in and call it a Design
Edition ;-)

Performance Sucks, But How Do You Prove It? - luckily this one is easy. What
you don't do is point them at a pre-canned site with pre-canned holes that
signatures will match; the outcome of this is obvious. What you also don't
do is point them at a badly written site where you don't know the holes; if
you do, you will not be able to determine the false positives and false
negatives. You have to use a site where you know the issues and can compare
what the tool did and didn't find. Introducing Hacme Books, a Foundstone
tool. You will find that most of the scanners (and yes, we have tested them)
find less than 10% of the issues in a normal web site. Don't believe me?
Try it. You'll be amazed. The best found 15% and the worst found 3% last
time I tried. And the best reported over 300 false positives; with the worst
I had to stop when it hit 50,000 false positives.
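The scoring method described above is easy to sketch: against a site whose true issues are already known, every scanner finding is either a true positive, a false positive, or contributes to a false negative count. The function and the sample data below are illustrative assumptions, not Hacme Books output.

```python
def score_scanner(known_issues, reported):
    """Score scanner findings against a baseline of known issues.

    Only works because the test site's real issues are known up front;
    against an unknown site neither rate can be computed.
    """
    known, found = set(known_issues), set(reported)
    true_pos = known & found
    return {
        "detection_rate": len(true_pos) / len(known),  # fraction of real issues found
        "false_positives": len(found - known),         # reported but not real
        "false_negatives": len(known - found),         # real but missed
    }

# Made-up baseline and scanner output, keyed as "vuln_class:path".
known = {"xss:/search", "sqli:/login", "trav:/files", "csrf:/transfer"}
reported = {"xss:/search", "xss:/bogus1", "xss:/bogus2"}
result = score_scanner(known, reported)
```

In this toy run the scanner finds 1 of 4 real issues while reporting 2 phantoms; the same arithmetic at scale produces the 3-15% detection rates and five-figure false positive counts mentioned above.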

----------------------------------------------------------------------------
OK, so I would never buy a web app scanner, right? Actually no; in context I
think they have a use: low hanging fruit. But if I am serious about finding
bugs and flaws in systems then I wouldn't rely on one. If I did buy one I
would buy SPI Dynamics' product, which when I tested it shone clearly above
the rest, but it's all in the context of the above. I would NEVER buy from
Watchfire / Sanctum, who I think are a highly unethical company on many
fronts. Legal reasons prevent me from expanding on that.


-----Original Message-----
From: Ory Segal [mailto:osegal () watchfire com] 
Sent: Saturday, August 13, 2005 11:03 AM
To: Kyle Starkey; RUI PEREIRA - WCG; jcreyes () etb net co
Cc: pen-test () securityfocus com; Webappsec
Subject: RE: RE: Application Assessment

Hello,

I would like to speak on behalf of my company (Watchfire). I wouldn't
usually address such a thread, but since the things that were mentioned were
basically incorrect, I thought it would be best to respond.

Watchfire has a very large team dedicated to AppScan's development. It is
the company's top priority.  We put out a significant new release last
September and the next one will be coming soon. 

You are invited to test the product(s) for yourself to decide which is best.


-Ory Segal/Watchfire


-----Original Message-----
From: Kyle Starkey [mailto:kstarkey () siegeworks com]
Sent: Friday, August 12, 2005 10:39 PM
To: RUI PEREIRA - WCG; jcreyes () etb net co
Cc: pen-test () securityfocus com; Webappsec
Subject: Re: RE: Application Assessment

I would suggest against the AppScan product unless you want to use their
developer's edition for pre-compiled code... There has been very little R&D
time/dollars allocated to this product in the past 24 months, and as such it
has lagged behind in functionality compared to the WebInspect product... If
you only have budget for one tool I would suggest WebInspect over the
others...


On Fri, 12 Aug 2005 1:32 pm, RUI PEREIRA - WCG wrote:
Juan,

Approx 1 year ago we did an evaluation between Appscan, Kavado, 
WebInspect and AppDetective. We chose WebInspect for the range of 
vulnerabilities tested for, the granularity of test selection, the 
flexibility of use, etc. Contact me offline if you want more detail on 
our selection process.

Thank You

Rui Pereira, B.Sc.(Hons), CIPS ISP, CISSP, CISA, Principal Consultant

WaveFront Consulting Group
Certified Information Systems Security Professionals

wavefront1 () shaw ca | 1 (604) 961-0701


----- Original Message -----
From: Juan Carlos Reyes Muñoz <jcreyes () etb net co>
Date: Friday, August 12, 2005 8:26 am
Subject: RE: Application Assessment


 Allen,

 One question... have you ever tried Watchfire's AppScan? If so, which
 tool would be better, AppScan or WebInspect?

 Juan Carlos Reyes Muñoz

 GIAC Certified Forensic Analyst - SANS Institute
 Information Security Consultant

 Cel. (57) 311 513 9280

 Miami Mailbox
 1900 N.W. 97th Avenue
 Suite No. 722-1971
 Miami, FL 33172


 The contents and thoughts included in this e-mail are completely
 personal. This e-mail message and any attachments are confidential and
 may be privileged. If you are not the intended recipient, please notify
 me immediately by replying to this message and please destroy all copies
 of this message and attachments. Please also try to forget everything
 you have read that was contained in this e-mail message, except this
 part. Misuse, copying and redistribution of this e-mail are forbidden.
 Thank you.

 > -----Original Message-----
 > From: Brokken, Allen P. [BrokkenA () missouri edu]
 > Sent: Thursday, August 11, 2005 01:43 p.m.
 > To: Glyn Geoghegan; goenw
 > Cc: pen-test () securityfocus com; Webappsec
 > Subject: RE: Application Assessment
 >
 > I am a Security Analyst for the University of Missouri - Columbia Campus.
 > I came from a systems administration background, and in the past 18
 > months have been tasked with application security as just part of a
 > greater Information Systems Auditing program.
 >
 > I personally have used
 >
 > SpikeProxy from www.insecure.org
 > Paros, mentioned by others
 > and evaluated a handful of other Proxy/Automated Attack Methods.
 >
 > However, the best tool I've seen and the one we finally purchased is
 > WebInspect from SPI Dynamics
 > http://www.spidynamics.com
 >
 > I did some independent tests between SpikeProxy and WebInspect on a few
 > different applications. With SpikeProxy it took basically 1 working day
 > to run the tool, verify false positives, look up good references for
 > the vulnerabilities and write the report. The same application with
 > WebInspect took approximately 15 minutes of my time to configure and
 > generate the final report, while taking about 2 hours to actually run
 > without my intervention. It typically found 20% more vulnerabilities
 > than I could find by the more manual method with SpikeProxy, and
 > produced extensive reports that not only explained the vulnerabilities,
 > but gave code references the developers could use to fix their problem.
 >
 > Those were results I got prior to training. I got some extensive
 > training with the tool, and on web application testing in general, at
 > Security-PS http://www.securityps.com. They are a professional
 > application security auditing company and they use this as their core
 > tool because of both the accuracy of the tool and the responsiveness of
 > the company. In the training I learned how to effectively use a whole
 > suite of tools, including a web brute-force attacker, SQL injector,
 > proxy, encoders / decoders, and web service assessment tools, to name a
 > few.
 >
 > The tool is a little pricey, but I work with literally dozens of campus
 > departments and have evaluated LAMP, JAVA/ORACLE, ASP.NET/SQL Server
 > and even VBScript/Access systems with the WebInspect suite of tools.
 > The #1 comment I get from the developers is how helpful the report was
 > in correcting their code. For that broad spectrum of coding
 > environments I couldn't possibly provide code-level help to the
 > developers without this product.
 >
 > We've been using it now for almost a year and the responsiveness of
 > their sales and technical staff has been excellent. I haven't had a
 > single issue that wasn't resolved in less than 24 hours. I've also
 > gotten a lot of support from their sales staff regarding application
 > security awareness for our campus developers in general.
 >
 > One last thing to mention is the updates. I have never seen a tool that
 > is so consistently updated. I have run 2 or 3 assessments in the same
 > day and had updates for new vulnerabilities made available each time I
 > ran the tool. If a week goes by without using it there can be literally
 > hundreds of new signatures it needs to add to the list.
 >
 > If you have more questions and want to talk offline I'd be happy to
 > answer them.
 >
 > Allen Brokken
 > Systems Security Analyst - Principal
 > University of Missouri
 > brokkena () missouri edu





Kyle Starkey
Senior Security Consultant
SiegeWorks
Cell: 435-962-8986

