WebApp Sec mailing list archives

The Right Approach to Web Developer Education


From: simon59 () gmx de
Date: Thu, 1 Jul 2004 08:57:11 +0200 (MEST)

I am of the opinion that if software is buggy, it is the consumer and
the buyer, not the producer, who at the end of the day is at fault. If the
consumer is happy to purchase goods which require a reboot three times a day,
and if the same consumer is glad to lose all their business data to a
software bug and sloppy service provision, no pressure to change things will
come from the consumer.
If the consumer pressures legislators to change the framework in which the
industry operates worldwide, the situation will most likely change.  

The problem now becomes financial and legislative, not technical. The
technical solutions exist but it is at present uneconomical to use them
because the customer does not place ANY importance on reliability,
availability and integrity of the computing tools used. 
The only way, therefore, to enforce security is for the customer to make it
uneconomical not to use the tools available.

A look beyond our own field, towards the manufacturing industry, shows what
has happened there: it is financially untenable to produce devices which
break down or injure people, because of the financial losses incurred in
recalls, repairs, court cases and so on, and manufacturers apply for
ISO 9000, CE, TÜV, VDE, BSI, UL and other certifications because these are
either required or selling points for the products they sell.
The terms "design for reliability" and "design for manufacturability" are so
old that they are no longer catchwords in quality circles.
This could only happen because customers were no longer prepared to accept
shirts and skirts which fade and shrink on first washing, injury and death
because a component of an automobile fails, machines which break down a few
months after purchase, or electrical equipment which shocks people out of
their boots.

Even in manufacturing, loopholes in environmental and child-labour
legislation are still exploited by manufacturing in countries with a poor
environmental and human-rights record.

The level of quality of goods purchased, both at B2B and consumer level,
varies with the legislation in force in the countries around the world, with
the highest levels being encountered where strong pressure groups have
successfully penalized poor suppliers.

How many software houses need to have security certification of the final
end-to-end solution they sell in order to survive economically?

It is thus apparent that industry understands only the language of money,
the drug which feeds and kills it.

Legal obligations enforced worldwide by independent judges, and pressure from
insurance companies raising premiums for companies which have no system for
ensuring that their IT assets provide the right level of security, are thus
the recipe needed to make it economically impossible for a software house
producing the current level of insecurity to survive, and to force companies
to make use of the available tools.

In manufacturing, Japan showed the world the way to quality. 
Will IT security come from the EU and the Med?

"He who has heard"


Yaakov Yehudi  (30.06.2004  07:10):
 http://acmqueue.com/modules.php?name=Content&pa=showpage&pid=160

Copied from ISN News; Since it seems very relevant to this thread.  What do
you think?



By Marcus J. Ranum
ACM Queue vol. 2, no. 4
June 2004 

Security bug? My programming language made me do it! 

Failing Miserably

It doesn't seem that a day goes by without someone announcing a critical
flaw in some crucial piece of software or other. Is software that bad? Are
programmers so inept? What the heck is going on, and why is the problem
getting worse instead of better?

One distressing aspect of software security is that we fundamentally don't
seem to "get it." In the 15 years I've been working the security beat, I
have lost track of the number of times I've seen (and taught) tutorials on
"how to write secure code" or read books on that topic.  
It's clear to me that we're:

* Trying to teach programmers how to write more secure code

* Failing miserably at the task

We're stuck in an endless loop on the education concept. We've been trying
to educate programmers about writing secure code for at least a decade and
it flat-out hasn't worked. While I'm the first to agree that beating one's
head against the wall shows dedication, I am starting to wonder if we've
chosen the wrong wall. What's Plan B?

Indeed, as I write this, I see that Microsoft, Intel, and AMD have jointly
announced a new partnership to help prevent buffer overflows using hardware
controls. In other words, the software quality problem has gotten so bad
that the hardware guys are trying to solve it, too.  
Never mind that lots of processor memory-management units are capable of
marking pages as nonexecutable; it just seems backward to me that we're
trying to solve what is fundamentally a software problem using hardware.
It's not even a generic software problem; it's a runtime environment issue
that's specific to a particular programming language.
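The "runtime environment issue" Ranum alludes to is C's unchecked array and string handling. As an illustration only (the function names below are hypothetical, not from the article), compare an unchecked copy with a bounded one:

```c
#include <stdio.h>
#include <string.h>

/* Unchecked copy: nothing in C's runtime stops 'src' from overrunning
   'dst' -- this is the buffer overflow that hardware vendors are now
   trying to contain with non-executable pages. Safe only if the caller
   guarantees 'dst' is large enough. */
void risky_copy(char *dst, const char *src) {
    strcpy(dst, src);                 /* no bounds check at all */
}

/* Bounded copy: truncates oversized input instead of overflowing,
   and always NUL-terminates. */
void bounded_copy(char *dst, size_t dstlen, const char *src) {
    snprintf(dst, dstlen, "%s", src);
}
```

Languages with checked array access make the first variant impossible by construction, which is why the problem is language-specific rather than generic to all software.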

Normally, when someone mentions programming languages in an article about
software quality, it's an invitation for everyone to jump in with useful
observations such as, "If we all programmed in [my favorite strongly hyped
programming language], we wouldn't have this problem!" That might be true in
some cases, but it's not reality.

We tried legislating a change of programming languages with Ada back in the
1990s. Remember Ada? That was an expensive disaster. Then we tried getting
everyone to switch to a "sandboxed" environment with Java in the late 1990s,
and it worked better, except that everyone complained about wanting to bypass
the "sandbox" to get file-level access to the local host. In fact, Java
worked so well, Microsoft responded with ActiveX, which bypasses security
entirely by making it easy to blame the user for authorizing bad code to
execute. Please, let's not have any more alternative programming languages
that will solve all our problems!

What's Plan B? I think that Plan B is largely a matter of doing a lot more
work on our compiler and runtime environments, with a focus on making them
embed more support for code quality and error checking.  
We've got to put it "below the radar screen" of the programmer's awareness,
just as we did with compiler optimization, the creation of object code, and
linking. We've done a great job building programming environments that
produce fast executables without a lot of hand-holding from the programmer.
In fact, most programmers today take optimization completely for granted; why
not software security analysis and runtime security, too? For that matter,
why are we still treating security as a separate problem from code quality?
Insecure code is just buggy code!

[...]
