Secure Coding mailing list archives

Economics of Software Vulnerabilities


From: gem at cigital.com (Gary McGraw)
Date: Tue, 13 Mar 2007 07:29:51 -0400

Hi crispy,

I'm not sure vista is bombing because of good quality.   That certainly would be ironic.   

Word on the "way down in the guts" street is that vista is too many things cobbled together into one big kinda 
functioning mess.  My bet is that Vista SP2 will be a completely different beast.

gem

company www.cigital.com
podcast www.cigital.com/silverbullet
blog www.cigital.com/justiceleague
book www.swsec.com
 

 -----Original Message-----
From:   Crispin Cowan [mailto:crispin at novell.com]
Sent:   Mon Mar 12 20:45:43 2007
To:     Ed Reed
Cc:     sc-l at securecoding.org
Subject:        Re: [SC-L] Economics of Software Vulnerabilities

Ed Reed wrote:
For a long time I thought that software product liability would
eventually be forced onto developers in response to their long-term
failure to take responsibility for their shoddy code.  I was mistaken. 
The pool of producers (i.e., the software industry) is probably too
small for such blunt economic policy to work.
  
I'm not sure about the size of the pool. I think it is more about the
amount of leverage that can be put on software:

    * It is trivial for some guy in a basement to produce a popular
      piece of open source software, which ends up being used as a
      controlling piece of a nuclear reactor, jet airplane, or
      automobile, and when it fails, $millions or $billions of damages
      result. The software author has nowhere near the resources to pay
      the damages, or even the insurance premiums on the potential damage.
    * In contrast, with physical stuff it is usually the case that the
      ability to cause huge damage requires huge capital in the first
      place, such as building nuclear reactors, jet planes, and cars.

With this kind of leverage, the software producers don't have the
resources to take responsibility, and so strict liability applied to
authors reduces to "don't produce software" unless, possibly, you work
for a very large corporation with deep pockets. Even then, corporate
bean counters would likely prevent you from writing any software because
the potential liability is so large.

It appears, now, that producers will not be regulated, but rather users
and consumers.  SOX, HIPAA, BASEL II, etc. are all about regulating
already well-established business practices that just happen to be
incorporating more software into their operations. 
  
Much like the gun industry. Powerful, deadly tools that, if used
inappropriately, can cause huge damage.

"Use appropriately" may be part of the key here. If you use your car
improperly and kill people as a result of e.g. your drunk driving, then
the car maker is not responsible. OTOH, if the design of your top-heavy
SUV combined with crappy tires results in rollovers, then courts do hold
the vendors responsible.

The problem with software: what is "appropriate"? Conceptually, that the
software in question has been sufficiently vetted for quality to justify
the risk involved. Efforts to do that kind of thing are used in select
industries (nukes and planes) but not widely, because the cost of
vetting is huge, so it is only used when the liabilities are huge.

Why? Because software metrics suck. Thirty years of software engineering
research, and LOC is still arguably one of the best metrics of software
complexity; there is almost nothing usable as a metric of software
quality.

It is not that no one has tried; lots of R&D goes into software
engineering. It's not that there are no new ideas; lots of those abound.
It's not that there have been no advances in understanding; we know a lot
more about the problem than we used to.

I think it is just that it is a hard problem.

Software, by its nature, is vastly more complex per pound :) ^W^W per
unit person effort than any other artifact mankind has ever produced.
One developer in one month can produce a functional software artifact
that it would take a hundred people 10 years to verify as safe. With
those ratios, this problem will not fall easily.

But as with other "serious" security policy formulations - the
technology is irrelevant.  The policies, whether SOX or Multi-level
Security, are intended to protect information of vital importance to the
organization.  If technical controls are adequate to enforce them -
fine.  If not, that in no way absolves the enterprise of the need to
provide adequate controls.
  
Sure it does :) Just show that your organization performed "due
diligence" up to "industry standards", and the fact that you failed
pretty much does absolve you in the eyes of the likes of SOX and
Basel.

It is a very interesting transition from trying to hold software vendors
liable to trying to hold deploying organizations liable, but this first
round of regulation looks like a sinecure for compliance consultants and
a few specialty vendors, and not much else.

The computer software industry has lost its way.  It appears to be
satisfied with prodding and encouraging software developers to develop
some modicum of shame for the shoddy quality of their output.  Feed the
beast, and support rampant featurism - it's what's made so many people
rich, after all.
  
The consumers who chose feature-rich over high-quality did that, not the
software industry.

In the long run, though, featurism without quality is not sustainable. 
That is certainly true, and I applaud efforts to encourage developers to
rise up from their primordial ooze and embrace the next steps in sane
programming (we HAVE largely stamped out self-modifying code, but
strcpy() is still a problem...)
  
I beg to differ. There is no evidence at all that the "good enough"
modality is not sustainable. Read Vernor Vinge's "A Deepness in the Sky"
for a fascinating vision on what 10,000 years of shoddy software
development could produce. And it's a damn fine book.

What's most disappointing to me is the near-total lack of discussion
about security policies and models in the whole computer security field,
today.
  
I see the policy field growing, albeit slowly. SELinux and AppArmor are
getting traction now, and 5 years ago they were exotic toys for weirdos.
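For readers who have not seen one, an AppArmor policy is just a per-program whitelist of allowed file accesses. A minimal profile sketch (hypothetical binary path and rules, not a real deployed policy):

```
# Confine /usr/bin/example: it may read its config and write its log,
# and nothing else beyond the base abstraction.
/usr/bin/example {
  #include <abstractions/base>
  /etc/example.conf  r,
  /var/log/example.log w,
}
```

SELinux expresses comparable restrictions through type enforcement rather than pathnames, but the underlying idea is the same: a mandatory policy that holds even when the confined program is compromised.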

If engineering is the practice of applying the logic and proofs provided
by science to real world situations, software engineering and computer
science seem simply to have closed their eyes to the question of system
security and internal controls.

Perhaps economics will reinvigorate the discussion in the coming decades.
  
I view this as completely ironic. It was economics that forced the
software industry to close its eyes to formalism and quality. The
industry won't change until economics makes quality matter more than
features, and I have yet to see any hint of that happening. For example,
Microsoft Vista is:

    * Much better code quality: MS invested heavily in both automated
      and human code checking before shipping.
    * Feature-poor: they pulled back on most of the interesting
      features, and as a result Vista is fundamentally XP++ with a
      pretty 3D GUI.
    * A year or two late.
    * Bombing in the market: the street chat I see is enterprises doing
      anything possible to avoid upgrading to Vista.

So it seems that even mighty Microsoft, when it tries for quality over
features, just gets punished in the marketplace.

Crispin

-- 
Crispin Cowan, Ph.D.                      http://crispincowan.com/~crispin/
Director of Software Engineering, Novell  http://novell.com
     Hacking is exploiting the gap between "intent" and "implementation"

_______________________________________________
Secure Coding mailing list (SC-L) SC-L at securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php
SC-L is hosted and moderated by KRvW Associates, LLC (http://www.KRvW.com)
as a free, non-commercial service to the software security community.
_______________________________________________



