Secure Coding mailing list archives

Dark Reading - Desktop Security - Here Comes the (Web)Fuzz - Security News Analysis


From: gem at cigital.com (Gary McGraw)
Date: Tue, 27 Feb 2007 06:22:34 -0500

Just for the record, the testing literature (non-security) supports Ken's point of view.  Possibly the most amusing thing about this whole black box versus white box discussion is that it is only one of many divisions in testing.  Others include partition testing, fault injection, and mutation testing.  We really have a long way to go with security testing to catch up.
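
To make that last one concrete, here is a minimal mutation-testing sketch in Python (the function and the tests are made up, purely to show the idea): plant a small change, a "mutant", in a copy of the code and ask whether the existing test suite notices.

# Purely illustrative sketch of mutation testing (hypothetical example).
# Idea: make a small change (a "mutant") to the code under test and see
# whether the existing test suite fails ("kills" the mutant). Surviving
# mutants point at weak spots in the tests.

def is_adult(age):
    # Original code under test.
    return age >= 18

def is_adult_mutant(age):
    # Mutant: the >= operator has been flipped to >.
    return age > 18

def run_tests(candidate):
    # A (deliberately weak) test suite: it never checks the boundary.
    return candidate(30) is True and candidate(5) is False

if __name__ == "__main__":
    print("original passes:", run_tests(is_adult))             # True
    print("mutant killed:  ", not run_tests(is_adult_mutant))  # False: the mutant survives
    # The surviving mutant tells us to add a test for the age == 18 boundary.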

gem

company www.cigital.com
podcast www.cigital.com/silverbullet
book www.swsec.com


 -----Original Message-----
From:   Kenneth Van Wyk [mailto:ken at krvw.com]
Sent:   Tue Feb 27 04:07:07 2007
To:     Secure Coding
Subject:        Re: [SC-L] Dark Reading - Desktop Security - Here Comes the (Web)Fuzz - Security News Analysis

On Feb 27, 2007, at 3:33 AM, Steven M. Christey wrote:
Given the complex manipulations that can work in XSS attacks (see RSnake's cheat sheet) as well as directory traversal, combined with the sheer number of potential inputs in web applications, multiplied by all the variations in encodings, I wouldn't be surprised if they were effective in finding those kinds of implementation bugs, even in well-designed software.  Successfully diagnosing some XSS without live verification does smell like a hard problem, though, akin to the Ptacek/Newsham "vantage point" issues in IDS.

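To get a feel for how quickly those encoding variations multiply, here is a rough Python sketch (the payload list and encoders are only examples, nowhere near exhaustive):

# Rough sketch of how encoding variations multiply a handful of payloads
# into a much larger test set (illustrative only).
from urllib.parse import quote

payloads = ["../../etc/passwd", "<script>alert(1)</script>"]

encoders = [
    lambda s: s,                                  # as-is
    lambda s: quote(s, safe=""),                  # percent-encoded
    lambda s: quote(quote(s, safe=""), safe=""),  # double percent-encoded
    lambda s: s.replace("/", "\\"),               # backslash variant
]

variants = {enc(p) for p in payloads for enc in encoders}
print(len(variants), "distinct variants from", len(payloads), "payloads")
for v in sorted(variants):
    print(" ", v)
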
With the track record of non-web fuzzers and PROTOS-style test suites, why do you think web app fuzzing is less likely to succeed?

It's not so much that I don't think fuzzing is useful; it's that I don't see "one size fits all" fuzzing _products_ being useful.

To me, it comes down to informed vs. uninformed (or "white box" vs. "black box", if you prefer) testing.  While both are useful and both should be exercised, I believe (though I have no hard statistics to back this up) that issues of coverage and state will always doom uninformed testing to being less effective than informed testing.  For a fuzzer to be really meaningful, I believe a "smart fuzzing" approach is going to be the best bet, and that makes a "one size fits all" product solution hard to deliver.
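
To illustrate the distinction with a sketch (not modeled on any particular product; the target format here, a small JSON message, is made up): an uninformed fuzzer mutates bytes blindly, so most of its cases die in the parser, while a smart fuzzer generates inputs that respect the structure the application actually expects and puts hostile values where the application will look.

# Illustrative contrast between "uninformed" and "smart" fuzzing.
# (The target format here, a small JSON message, is hypothetical.)
import json, random

SEED = b'{"user": "alice", "qty": 1}'

def uninformed_case(seed=SEED):
    # Blind byte mutation: most outputs fail parsing immediately,
    # so deeper application state is rarely reached.
    data = bytearray(seed)
    for _ in range(random.randint(1, 4)):
        data[random.randrange(len(data))] = random.randrange(256)
    return bytes(data)

def smart_case():
    # Structure-aware generation: always valid JSON, with hostile
    # values placed where the application will actually look.
    hostile = random.choice(["<script>alert(1)</script>", "../" * 8, "A" * 4096, -1])
    return json.dumps({"user": hostile, "qty": random.choice([0, -1, 2**31])}).encode()

if __name__ == "__main__":
    random.seed(0)
    print("uninformed:", uninformed_case())
    print("smart:     ", smart_case())

Coverage or state feedback would make the smart side stronger still, but even this much already requires knowing the input format, which is exactly the per-application work a generic product can't do for you.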

To do smart fuzzing, a lot of setup time goes into building an appropriate test harness and test cases that fully exercise the files, network interface data, user data, etc., that the software expects.
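
Concretely, the per-application setup looks something like this (the endpoints, parameter names, and pass/fail check below are all hypothetical; the point is only where the time goes):

# Sketch of the per-application setup a "smart" web fuzzing harness needs.
# (The endpoint list, parameter names, and oracle below are hypothetical.)
import itertools, urllib.parse, urllib.request

# 1. Application knowledge gathered up front: which endpoints exist and
#    which parameters each one expects.
ENDPOINTS = {
    "http://target.example/search": ["q", "page"],
    "http://target.example/profile": ["user_id"],
}

PAYLOADS = ["'", "<script>alert(1)</script>", "../" * 6, "A" * 2048]

def looks_broken(status, body):
    # 2. An oracle, also application-specific: what counts as a finding.
    return status >= 500 or "<script>alert(1)</script>" in body

def run():
    for url, params in ENDPOINTS.items():
        for param, payload in itertools.product(params, PAYLOADS):
            full = url + "?" + urllib.parse.urlencode({param: payload})
            try:
                with urllib.request.urlopen(full, timeout=5) as resp:
                    if looks_broken(resp.status, resp.read().decode(errors="replace")):
                        print("possible issue:", full)
            except Exception as exc:
                print("request failed:", full, exc)

if __name__ == "__main__":
    run()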

Perhaps I'm totally off base, and I invite any product folks here to chime in and correct my misconceptions.

Cheers,

Ken
-----
Kenneth R. van Wyk
SC-L Moderator
KRvW Associates, LLC
http://www.KRvW.com


