Secure Coding mailing list archives

Re: Interesting article ZDNet re informal software development quality


From: Crispin Cowan <crispin () immunix com>
Date: Tue, 06 Jan 2004 23:32:02 +0000


Kenneth R. van Wyk wrote:


> On Monday 05 January 2004 22:16, Crispin Cowan wrote:
>
>> In other news, pundits observe that the Linux kernel's development
>> methodology violates most of the tenets of Brooks' "Mythical Man Month".
>> Using ludicrous methods such as distributed development by a diverse
>> team, many of whom have never met in person, and accelerating
>> development by adding people, analysts observe that this method could
>> not possibly work, and therefore the Linux kernel must be a figment of
>> everyone's imagination :)
>
> *grin*  That sure isn't what I was trying to imply in passing along the
> ZDNet article.  (Indeed, my own SOHO environment relies almost
> exclusively on Debian -- mostly Sarge and Sid.)  I simply found it
> interesting that someone is doing research into how the less formal
> process affects the quality of the results.  To me, that has nothing to
> do with an open (or closed) source model, but everything to do with the
> process.  I'm looking forward to reading their findings.


Ok, setting aside the stand-up comedy, here is my serious view on this and
other similar findings.


The academic discipline of Software Engineering has spent 30 years 
trying to find software development methods that reliably produce 
quality software. This quest is rooted in other engineering disciplines 
(bridge building, steam boiler engineering, etc.). There is a need to 
create a "cook book" of methods that can be handed to *average* (not 
brilliant) programmers so that they can produce every-day software 
artifacts that aren't as fragile as a house of cards.


Software engineering has found a collection of methods (formal methods, 
formal specifications, ISO 9000, Common Criteria, etc.) that appear to 
produce better software artifacts than just slamming together some crap. 
Unfortunately, these methods are *not* satisfactory: not only are they 
outrageously expensive, but they also fail the reliability test. While 
they *tend* to produce better software than random methods do, they 
absolutely do *not* give you the reliable artifact production that 
bridge building and boiler engineering methods do.
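

To make "formal specification" a bit more concrete, here is a minimal,
illustrative sketch in the lightweight design-by-contract flavor of the
idea -- the withdraw function and its conditions are my own invention,
not anything from the article. The assertions are the machine-checkable
piece of the specification:

    def withdraw(balance, amount):
        # Precondition: callers must satisfy this part of the spec.
        assert amount > 0, "amount must be positive"
        assert amount <= balance, "cannot overdraw"
        new_balance = balance - amount
        # Postcondition: the result must satisfy the spec as well.
        assert new_balance >= 0
        return new_balance

Full-strength formal methods go much further than run-time assertions
(machine-checked proofs against a mathematical specification), which is
exactly where the cost explodes.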


Nonetheless, various advocates of these software engineering methods 
treat compliance with their favorite methods as if it were quality 
itself. The cited article complains about the lack of formal design in 
open source software as if formal design were quality, when it is no such 
thing. There is not even a causal relation in *either* direction: 
quality software can exist without formal methods, and formal methods 
can produce crap software.


IMHO, this article makes the grave error of assuming that the author's 
favorite methods are the only viable path to software quality. It 
ignores the empirical result that open source's apparently ad hoc 
methods are quite capable of producing quality software artifacts. 
Rather than whinge about lack of compliance with failed methods, 
software engineering researchers would do well to study the methods used 
by successful open source projects, and attempt to abstract new methods 
that may help to produce quality artifacts. Arguably, newer methods such 
as XP (Extreme Programming) are doing exactly that.
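

As a concrete illustration of the kind of practice XP abstracts from
working projects, here is a small, hypothetical test-first sketch in
Python -- the parse_port function and its behavior are my own example,
not drawn from any project mentioned here. The test is written before
the code, so the test suite becomes an executable specification that
grows with the program:

    import unittest

    def parse_port(text):
        # Written only after the tests below, per XP's test-first rule.
        port = int(text)
        if not 0 < port < 65536:
            raise ValueError("port out of range: %d" % port)
        return port

    class ParsePortTest(unittest.TestCase):
        # The tests come first and act as the specification.
        def test_valid_port(self):
            self.assertEqual(parse_port("8080"), 8080)

        def test_out_of_range_rejected(self):
            self.assertRaises(ValueError, parse_port, "70000")

    if __name__ == "__main__":
        unittest.main()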


Crispin

--
Crispin Cowan, Ph.D.  http://immunix.com/~crispin/
CTO, Immunix          http://immunix.com
Immunix 7.3           http://www.immunix.com/shop/