Secure Coding mailing list archives

re-writing college books - erm.. ahm...


From: leichter_jerrold at emc.com (Leichter, Jerry)
Date: Mon, 6 Nov 2006 08:22:32 -0500 (EST)

| Most of the incidents in your first paragraph were improved with the
| establishment of laws, regulatory bodies, and external testing with a
| stamp of approval.  Underwriters Laboratories was established to
| ensure that a product was safe to sell to the public, because
| manufacturers were focused on sales and not much on safety.
|  
|  Having a clearing house/body/organization which inspects an
| application for business worthiness, critical machinery functions,
| consumer entertainment, or consumer-sensitive use (personal planner,
| financial package) might persuade vendors to choose security-enabled
| tools and software-building tools strictly based upon how difficult
| it is to achieve a worthiness approval.  For instance, if an
| application is developed in C, C++, or C#, then an appropriate set of
| audits and tests must be performed to achieve certification.  Since
| these would be more comprehensive than for other languages, the
| vendor would have to decide which is easier: the extra expense of
| the extensive audits and testing, or using an approved language that
| does not require the same degree of testing.  This would give
| software tool vendors an incentive to create more security-enabled
| tools, to align themselves with the requirements of the body
| delivering the accreditation for use.
|  
|  The software industry is no different from most other industries.
| There are devices with little risk (a garage door opener), devices
| with somewhat more user risk, depending upon how you look at it (a
| radio), and then there are very dangerous appliances (irons,
| microwaves, snow blowers).  For software: a game, a word processor,
| or a personal finance planner; for a business, a financial package.
I agree with what you say .... until the very last paragraph.  The
difference in the software industry is that we don't really know yet
what the right standards should be - and it's not even clear to me that
there is any concerted effort to find out.  Yes, it's possible to build
highly reliable software for things like aircraft control.  But it's
(a) extremely expensive to do - the techniques we have today are so
expensive as to rule out all but obviously life-critical uses of
software; and (b) it only gives you assurance in certain directions.
In particular, aircraft control software doesn't have to be exposed to
active, intelligent attackers on the Internet.

All elements of the software development community - developers
themselves, software makers, makers of products based on software, even
most software purchasers and users - have a vested interest in resisting
such laws, regulations, and external bodies.  It crimps their style,
adds to their costs, and has no obvious benefits.  The same, of course,
was true of most other kinds of engineered product development at some
point in their life.  However, it's much easier for everyone to see that
there is something wrong when a boiler blows up because the maker saved
money by using steel that's too brittle than it is to see that a bug in
handling URLs allowed a cross-site scripting attack that let critical
information be stolen.  This leads to much less pressure on
the software industry to fix things.  Things are changing, but we're
running into another problem: Even very simple attempts at regulation
quickly run into lack of the basic science and engineering knowledge
necessary.  Industry standards on boilers could refer to standard tables
of steel strengths.  An industry standard to prevent cross-site
scripting attacks could refer to ... what, exactly?  And, again, there
is the important distinction that boiler failures are due to "lawful"
physical processes, while cross-site scripting prevention failures are
due to intelligent attacks.  Perhaps for that kind of thing we need to
look not at industry standards but at military experience....
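To make the cross-site scripting point concrete, here is a minimal
sketch of the kind of URL-handling bug referred to above.  The page
template, function name, and query parameter are hypothetical,
invented purely for illustration; the only real point is that the
"standard table" a regulator could cite amounts to "escape
attacker-controlled text before putting it in a page":

```python
import html
from urllib.parse import parse_qs, urlsplit

def render_search_page(url: str, escape: bool = True) -> str:
    """Render a (hypothetical) search-results page echoing the 'q' parameter."""
    query = parse_qs(urlsplit(url).query).get("q", [""])[0]
    # Vulnerable version: attacker-controlled text is interpolated verbatim,
    # so markup in the query string becomes live markup in the page.
    # Fixed version: the same text is HTML-escaped first and renders inert.
    shown = html.escape(query) if escape else query
    return f"<p>Results for: {shown}</p>"

attack = "https://example.com/search?q=<script>steal()</script>"
print(render_search_page(attack, escape=False))  # script tag survives
print(render_search_page(attack, escape=True))   # script tag neutralized
```

The fix is one line, but notice how little of it generalizes into a
checkable industry standard: "escape output" depends on where the
output lands (HTML body, attribute, JavaScript, URL), which is exactly
the missing engineering knowledge the paragraph above describes.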

                                                        -- Jerry

