Secure Coding mailing list archives
Tools: Evaluation Criteria
From: peter.amey at praxis-his.com (Peter Amey)
Date: Wed, 23 May 2007 16:27:09 +0100
[snip]
> Good to see that folks are expanding the criteria in terms of what it scans for, but criteria as to how it integrates is also equally useful.
On the contrary, I find the idea of evaluating tools by "what they scan for" very disturbing. It reflects a continuing belief that software engineering means building things and then "scanning" them for defects. We /must/ move to tools and methods that help us construct correct programs, rather than looking for defects in them afterwards.

Let me give you an example from my previous aeronautical career. The de Havilland Comet was the world's first transatlantic jet airliner. All went well until, after a year or two of service, there was a series of catastrophic airframe failures in flight, all with total loss of those on board. ISTR that there were three crashes in all. The design defect that caused the disasters was a combination of square cabin windows and hull pressurisation on each flight. The square windows amplified stress at each window corner which, combined with the cyclic stress changes from pressurisation, caused metal fatigue failures and hull loss. Metal fatigue was little understood at the time.

Now for the lessons. The aero industry quickly learned about metal fatigue and stress raisers, and used that knowledge to design fuselages that did not suffer from the Comet's defects. That is why your airliner window is oval, not square. There have been very, very few Comet-style failures since (and those are usually associated with corrosion or poor maintenance). So the problem was analysed, understood, disseminated and hence eliminated.

In the software world we seem content to build "window squareness detection tools". Some will find 70% of square windows, but miss others and produce false alarms in yet other cases. Buffer overflows are the square windows of secure software: we shouldn't be /scanning/ for them but using languages and tools that /prevent/ their introduction.
Regards

Peter

--------------------------------------------------------
Peter Amey BSc ACGI CEng CITP MRAes FBCS
CTO (Software Engineering)
direct: +44 (0) 1225 823761
mobile: +44 (0) 7774 148336
peter.amey at praxis-his.com

Praxis High Integrity Systems Ltd
20 Manvers St, Bath, BA1 1PX, UK
t: +44 (0)1225 466991
f: +44 (0)1225 469006
w: www.praxis-his.com
--------------------------------------------------------