funsec mailing list archives

Re: Curious questions...


From: Florian Weimer <fw () deneb enyo de>
Date: Tue, 25 Oct 2005 00:46:37 +0200

* Kowsik Guruswamy:

> Questions are as follows:
> - How many of you have worked in product development where there was
> at least 1 million lines of code (a number pulled out of thin air) to
> which you had to contribute? It doesn't matter if it was open source
> or commercial.

Does being a Debian Developer qualify? 8-)

> - During that process how many 'vulnerabilities' (i.e. bugs) did you
> end up introducing? This could be based on automated analysis,
> peer-reviews, audits, full-disclosures, etc

One, but mostly due to lack of activity on my part.  I'm more
concerned with *removing* bugs than introducing them. 8-) Writing
usable, secure software is very hard with the tools we usually have at
hand, so I only do it if absolutely necessary.

> - What tools did you use to help you find these vulnerabilities?

Peer review.  Actually, the bug was pretty obvious (wrong permissions
on newly created files), so any user would have caught it after some
time.  (I imported upstream code without enough review; I only fixed a
more severe security hole and thought it couldn't get worse.)

> The reason for my questions is simple: There seems to be a huge
> [technology/awareness] gap between the people that build
> software/hardware/systems and the people that find holes in those
> systems.

Indeed.  In fact, this is a problem receiving more and more attention.
I recall that this was at the center of a presentation at FIRST 2006,
for example.

> What I'm really leading to is, how can we, as people involved in the
> security industry, address and fix this gap? Full-disclosure is fine
> and dandy, but it doesn't get to the root cause early enough.

First of all, you have to ask yourself how improved software quality
affects your business.  Suppose vulnerabilities in deployed
configurations were significantly harder to spot, and far less common.
Could you still stay in business?  How many people would you have to
lay off (penetration testers, vulnerability research staff, and so
on)?  Do you really want to proceed in that direction?

I believe for a lot of players in the market, the answer is a
resounding "no".  A huge industry is feeding off vulnerabilities in
widely-used software.  I'm sure some of the big names secretly hope
that XP SP2 does not change the vulnerability landscape in the way
that was initially anticipated.

Anyway, let's assume that you work in a niche where vulnerabilities
(or, more precisely, vulnerabilities in software products) do not
matter that much.  As far as I can tell, most people in a similar
situation believe that it's a tools, education, or market issue.

Market is easy to explain.  The basic claim is that people do not
actually want secure products, and that most software companies need
the revenue stream that comes from regular software updates.  As a
result, there is no incentive to create a stable code base which
matures over the years.  The solution seems to be to go to law school,
and after that, lobby politicians to change the framework in which the
market operates.  So far, nobody has really tried this, and the side
effects are somewhat unpredictable.  It could work, though, and
someone should be able to fund it.

Now to tools.  I'm oversimplifying, but basically, you have a choice
between a poor memory model (C), poor concurrency model (Java), or
poor type system (Python & Perl).  There are virtually no static
analysis tools for widely used languages which can be integrated
easily into development processes (mainly due to false positive rate
and cost, I think).  The solution is to study type theory and similar
topics and create better tools.  The downside is that many people
actually did this (there are a number of very neat development
environments that come pretty close to what would be needed), but
their research results didn't have much impact on the way software is
developed.  Their proof-of-concept implementations are in decline, and
few of them are still being maintained.  So let's assume that you've
created a C implementation which does not suffer from buffer
overflows, format string vulnerabilities, and double-free/malloc
corruption bugs *at* *all* and is ABI-compatible with existing C code.
Will anybody use it?  No.  (Actually, there are already two systems
which come pretty close, CCured and Cyclone.)

Education is a bit harder to explain.  The key observation is that
developers are generally smart people, but they still write vulnerable
code.  As you wrote, the basic idea is to create special classes for
them which explain, in a language familiar to software developers,
attack techniques and their relationship to common coding errors.
Other things to explore are security anti-patterns which may introduce
vulnerabilities (for example, security checks and protected operation
on different layers).  From my limited experience in this area, I
think both camps (security guys and developers) have something to
learn from each other.  The security guys must make realistic demands,
based on a real understanding of what the application does, otherwise
they won't be taken seriously.[1] Application developers need to
understand that a conclusive proof that something is a security issue
is often more difficult to obtain than making a change to be on the
conservative side (including complete regression tests).  The list of
things which can be done in this area of education is quite long, and
to my knowledge, nobody has yet finished compiling a reasonably
complete one.  I'm a bit skeptical whether all these efforts pay off
in the end, too.  If you are lucky, the general improvement in
software quality and maintainability is already worth it.  On the
other hand, many projects are very short-term (the next release, the
next quarter results, you name it).

[1] For example, most people who complain about electronic voting
    software have never witnessed how it is used in the field.  This
    doesn't mean their criticism is wrong in general, but it certainly
    misses the point in a few details.  Often, such mistakes undermine
    the message you want to get across.
_______________________________________________
Fun and Misc security discussion for OT posts.
https://linuxbox.org/cgi-bin/mailman/listinfo/funsec
Note: funsec is a public and open mailing list.

