Interesting People mailing list archives

IP: 2 on "Buffer Overflow" security problems: [risks] Risks Digest 21.84


From: David Farber <dave () farber net>
Date: Sat, 05 Jan 2002 18:06:56 -0500


------------------------------

Date: Wed, 26 Dec 2001 21:19:22 -0800
From: Henry Baker <hbaker1 () pipeline com>
Subject: "Buffer Overflow" security problems

I'm no fan of lawyers or litigation, but it's high time that someone defined
"buffer overflow" as being equal to "gross criminal negligence".

Unlike many other software problems, this problem has had a known cure since
at least PL/I in the 1960's, where it was called an "array bounds
exception".  In my early programming days, I spent quite a number of unpaid
overtime nights debugging "array bounds exceptions" from "core dumps" to
avoid the even worse problems which would result from not checking the array
bounds.
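[To make the contrast concrete, here is a minimal C sketch; the names and
sizes are invented for illustration.  The unchecked store is the idiom C
permits silently; the checked store spells out by hand what a PL/I-style
run-time "array bounds exception" does automatically.]

    #include <stdio.h>
    #include <stdlib.h>

    #define N 8
    static int buf[N];

    /* The idiom C permits: no check at all.  With i = 12 this
       silently writes past the end of buf (undefined behavior). */
    static void store_unchecked(int i, int v) {
        buf[i] = v;
    }

    /* What PL/I-style bounds checking does implicitly, written out
       by hand: refuse the access and raise an "array bounds
       exception" instead of corrupting memory. */
    static void store_checked(int i, int v) {
        if (i < 0 || i >= N) {
            fprintf(stderr,
                    "array bounds exception: index %d not in [0,%d)\n",
                    i, N);
            exit(EXIT_FAILURE);
        }
        buf[i] = v;
    }

    int main(void) {
        store_checked(12, 42);    /* dies here with a clear message ... */
        store_unchecked(12, 42);  /* ... instead of corrupting memory here */
        return 0;
    }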

I then spent several years of my life inventing "real-time garbage
collection", so that no software -- including embedded systems software --
would ever again have to be without such basic software error checks.

During the subsequent 25 years I have seen the incredible havoc wreaked upon
the world by "buffer overflows" and their cousins, and continue to be amazed
by the complete idiots who run the world's largest software organizations,
and who hire the bulk of the computer science Ph.D.'s.  These people _know_
better, but they don't care!

I asked the CEO of a high-tech company whose products are used by a large
fraction of you about this issue and why no one was willing to spend any
money or effort to fix these problems, and his response was that "the
records of our customer service department show very few complaints about
software crashes due to buffer overflows and the like".  Of course not, you
idiot!  The software developers turned off all the checks so they wouldn't
be bugged by the customer service department!

The C language (invented by Bell Labs -- the people who were supposed to be
building products with five 9's of reliability -- 99.999%) then taught two
entire generations of programmers to ignore buffer overflows, and nearly
every other exceptional condition as well.  A famous paper in the
Communications of the ACM (the 1990 "fuzz" study by Miller, Fredriksen, and
So) found that a quarter to a third of the Unix utilities tested (all
written in C) could be made to crash or hang (sometimes in spectacular
ways) when fed random characters ("line noise") as input.  And this after
Unix became the de facto
standard for workstations and had been in extensive commercial use for at
least 10 years.  The lauded "Microsoft programming tests" of the 1980's were
designed to weed out anyone who was careful enough to check for buffer
overflows, because they obviously didn't understand and appreciate the
intricacies of the C language.
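[The fuzz experiment is easy to reproduce in miniature.  A sketch, assuming
only a Unix pipeline; the byte count is arbitrary: emit random bytes and
pipe them into the utility under test.]

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    /* Emit COUNT random bytes ("line noise") on stdout.
       Usage:  ./noise | some_unix_command  */
    int main(void) {
        enum { COUNT = 100000 };
        srand((unsigned) time(NULL));
        for (int i = 0; i < COUNT; i++)
            putchar(rand() & 0xFF);
        return 0;
    }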

I'm sorry to be politically incorrect, but for the ACM to then laud "C" and
its inventors as a major advance in computer science has to rank right up
there with Chamberlain's appeasement of Hitler.

If I remove a stop sign and someone is killed in a car accident at that
intersection, I can be sued and perhaps go to jail for contributing to that
accident.  If I lock an exit door in a crowded theater or restaurant that
subsequently burns, I face lawsuits and jail time.  If I remove or disable
the fire extinguishers in a public building, I again face lawsuits and jail
time.  If I remove the shrouding from a gear train or a belt in a factory, I
(and my company) face huge OSHA fines and lawsuits.  If I remove array
bounds checks from my software, I will get a raise and additional stock
options due to the improved "performance" and decreased number of calls from
customer service.  I will also be promoted, so I can then make sure that
none of my reports will check array bounds, either.

The most basic safeguards found in "professional engineering" are cavalierly
and routinely ignored in the software field.  Software people would never
drive to the office if building engineers and automotive engineers were as
cavalier about buildings and autos as the software "engineer" is about his
software.

I have been told that one of the reasons for the longevity of the Roman
bridges is that their designers had to stand under them when they were first
used.  It may be time to put a similar discipline into the software field.

If buffer overflows are ever controlled, it won't be due to mere crashes,
but due to their making systems vulnerable to hackers.  Software crashes due
to mere incompetence apparently don't raise any eyebrows, because no one
wants to fault the incompetent programmer (and his incompetent boss).  So we
have to conjure up "bad guys" as "bogeymen" in (hopefully) far-distant
lands who "hack our systems", rather than noticing that in pointing one
finger at the hacker, we still have three fingers pointed at ourselves.

I know that it is my fate to be killed in a (real) crash due to a buffer
overflow software bug.  I feel like some of the NASA engineers before the
Challenger disaster.  I'm tired of being right.  Let's stop the madness and
fix the problem -- it's far worse, and has caused far more damage, than any Y2K
bug, and yet the solution is far easier.

Cassandra, aka Henry Baker <hbaker1 () pipeline com>

------------------------------

Date: Wed, 26 Dec 2001 21:19:22 -0800
From: Peter G Neumann <Neumann () CSL sri com>
Subject: "Buffer Overflow" security problems (Re: Baker, RISKS-21.84)

Henry, please remember that even an expressive programming language that
prevents you from doing bad things would with very high probability be
misused, not only by programmers who eschew discipline but even by very
good ones; conversely, a badly designed programming language can yield
excellent programs if used wisely and carefully.  Besides, buffer overflows
are just one symptom.  There are still lots of lessons to be learned from an
historical examination of Fortran, Pascal, Euclid, Ada, PL/I, C, C++, Java,
etc.

Perhaps in defense of Ken Thompson and Dennis Ritchie, C (and Unix, for that
matter) was created not for masses of incompetent programmers, but for Ken
and Dennis and a few immediate colleagues.  That it is being used by so many
people is not the fault of Ken and Dennis.  So, as usual in RISKS cases,
blame needs to be much more widely distributed than it first appears.  And
pursuing Henry's name-the-blame game, whom should we blame for Microsoft
systems used unwisely in life- and mission-critical applications?  OS
developers?  Application programmers?  Programming language developers?
Users?  The U.S. Navy?  Remember the unchecked divide-by-zero in an
application that left the U.S.S. Yorktown missile cruiser dead in the water
for 2.75 hours (RISKS-19.88 to 94).  The shrinkwrap might disclaim liability
for critical uses, but that does not stop fools from rushing in.
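
[The Yorktown failure mode reduces to a one-line omission.  A purely
illustrative C sketch, not the actual shipboard code; the function names
and the error-flag convention are invented.]

    #include <stdio.h>

    /* Unchecked: with entries == 0 the integer divide raises SIGFPE
       on most hardware and the process dies, taking whatever depends
       on it dead in the water along with it. */
    static int average_unchecked(int total, int entries) {
        return total / entries;
    }

    /* Checked: the guard the application lacked. */
    static int average_checked(int total, int entries, int *ok) {
        if (entries == 0) {
            *ok = 0;
            return 0;
        }
        *ok = 1;
        return total / entries;
    }

    int main(void) {
        int ok;
        int avg = average_checked(100, 0, &ok);
        printf("ok=%d avg=%d\n", ok, avg);
        return 0;
    }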

Nothing in the foregoing to the contrary notwithstanding, it would be very
helpful if designers of modern programming languages, operating systems, and
application software would more judiciously observe the principles that we
have known and loved lo these many years (and that some of us have even
practiced!).  Take a look at my most recent report, on principles and their
potential misapplication, for DARPA's Composable High-Assurance Trustworthy
Systems (CHATS) program, now on my Web site:
http://www.csl.sri.com/neumann/chats2.html


For archives see:
http://www.interesting-people.org/archives/interesting-people/

