Dailydave mailing list archives

Re: Source Code Analysis


From: Matt <matt () use net>
Date: Thu, 7 Sep 2006 13:22:18 -0700 (PDT)

On Thu, 7 Sep 2006, Dave Aitel wrote:

> Source code analysis tools and products have traditionally struggled
> with static analysis. Their market share doesn't make it look like
> they're struggling, but in my opinion, the technology is not there
> yet. However, they do occasionally come up with interesting results.
> Oddly, the things they are worst at (buffer overflows) are the
> old-school things that are mostly gone anyways.

It's interesting you say there is a big market -- I don't think there is
one for code analysis alone. My vague guess is there's maybe $20 million a
year, sustainably. Split that between 4-5 companies, and there's just not
enough for anyone to bother VC funding a company based on that alone.

You'd have to have a really compelling SUITE lined up, with each product in
the suite providing value on its own and making real-world sense in the
context of the other products in the suite. I don't see anything like this
from the current vendors that isn't vapor.

BTW, PC-Lint can very accurately detect out-of-bounds memory access, and
has been able to for quite some time. It's also a console app that is $239
and runs great under Wine on Linux (and presumably Mac OS X on x86). They
are really nice folks and are pretty good about fixing the bugs I've
reported to them over the years. Alas, it doesn't do taint tracking, and it
does require configuration. I've built up a good config file over the past
7 years, which I'm happy to share if anyone wants it.
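
To make that concrete, here's the flavor of out-of-bounds bug this class
of checker catches without any annotations (a toy sketch, not actual
PC-Lint output):

    /* Toy example of the out-of-bounds write a value-tracking checker
       flags: the <= lets the last iteration write buf[16], one element
       past the end of the 16-byte array. */
    void zero_pad(void)
    {
        char buf[16];
        int i;

        for (i = 0; i <= 16; i++)   /* off-by-one: should be i < 16 */
            buf[i] = '\0';
    }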



> I'm wondering what the results were for the Firefox scan announced
> today. The product they used was this:
> http://www.klocwork.com/forms/code_defect_scan.asp
> http://www.g2zero.com/2006/09/examining_defects_in_the_firef.html

A former co-worker of mine showed me a neat demo of Klocwork's product at
s-3con last February. From an analysis perspective it wasn't that
interesting, but for a manual reviewer the exploration tools were pretty
neat once you knew how to use them. It seemed like a good balance of the
two: automation and manual assistance. Most products are too unbalanced in
that regard, and that's what is holding things back, IMO.


> CoolQ gave a talk on his efforts regarding source code analysis via
> gcc AST translation and state-table analysis at XCon 2006. I thought
> it was well put together for people who are not completely wrapped up
> in static analysis to understand the basic concepts. I don't think his
> paper is available publicly yet, but he found some bugs in the Linux
> kernel with his tool relating to lock/unlock issues. His tool is also
> not public, but the concepts don't seem that hard to implement for the
> GCC team or someone familiar with the code-base.

Coverity did something like this initially, also.
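
The lock/unlock class of bug is easy to picture with a state-table
checker: track each lock through locked/unlocked states along every path
and flag any path that exits in the wrong state. A simplified sketch of
the pattern being caught (pthread-based toy code, not taken from his
paper or from the kernel):

    /* Sketch of the lock/unlock pattern a state-table checker walks the
       AST for: the error path returns while dev_lock is still held. */
    #include <errno.h>
    #include <pthread.h>

    static pthread_mutex_t dev_lock = PTHREAD_MUTEX_INITIALIZER;
    static int dev_ready;

    int dev_write(int len)
    {
        pthread_mutex_lock(&dev_lock);

        if (!dev_ready)
            return -EAGAIN;         /* bug: exits with dev_lock held */

        /* ... perform the write ... */

        pthread_mutex_unlock(&dev_lock);
        return len;
    }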


> A lot of the static analysis products are trying to be sold as a
> service, which I find funny. It's very weird when people want to run
> your code through their tool and won't let you use the tool yourself.
> It's interesting also that there isn't a tool that is a guided hand
> for the user, rather than a scanner. The VC money loves scanners of
> all sorts. Scale out and scale up the profits! But scanners end up
> being useless in a lot shorter timeframe than tools that enhance a
> professional auditor. Taint/untaint is something that someone wrote an
> emacs plugin for years ago, but it'll never see funding. That's good
> for bug hunters. :>

Anyone doing this as a service is just clumsily trying to justify a
business model. Hopefully no one is seriously thinking about that out
loud. Like I said above, Klocwork seemed to be better at being a guidance
tool for manual review. There are several up-and-coming startups in that
area from what I hear. Also, I believe FindBugs has a few sources of
funding -- they were looking to hire a full-time product-manager type six
months ago or so.
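
On the taint/untaint point above, the core idea really is simple enough
for an emacs plugin: mark data from untrusted sources and warn when it
reaches a sensitive sink without being sanitized. A toy example of the
kind of flow being tracked (made-up code, not any particular tool's
output):

    /* Toy taint flow: argv is an untrusted source and reaches a
       sensitive sink (a printf format string) unchecked. A taint
       tracker marks argv tainted and warns at the printf call. */
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        if (argc < 2)
            return 1;

        printf(argv[1]);            /* warning: tainted format string */
        return 0;
    }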

The whole market will be commoditized in a few years by open source things
like FindBugs, bugreport, PMD, Gendarme, etc. Wise VCs are staying out of
this first round and are waiting for history to repeat itself like it did
with network scanners, IDSes, etc. It's the same cycle. Again. Many people
don't want to hear it, and more power to them. At that point, what will
these companies do without a diversified product portfolio? Go into
services? Haw.

The best bet for these companies is to open source their
products now so they become the de facto standard like Nessus, Snort, etc.
If they're afraid to do that because everyone would see how bad their
analysis or code quality is, they've already lost and are just wasting
investor money at this point. The customers are already up in arms over
the wildly varying quality of even minor fix releases. Users can't deploy
fixes because of the new false positives and negatives that get
introduced. As one user of several tools said to me, "We're not the beta
tester -- we're the QA. It's like they don't test *anything* before they
ship." It's practically like open source software in this market already;
they might as well go all the way.

This is all my highly uneducated opinion, of course.


--
tangled strands of DNA explain the way that I behave.
http://www.clock.org/~matt
_______________________________________________
Dailydave mailing list
Dailydave () lists immunitysec com
http://lists.immunitysec.com/mailman/listinfo/dailydave

