Secure Coding mailing list archives

Where Does Secure Coding Belong In the Curriculum?


From: pmeunier at cerias.purdue.edu (Pascal Meunier)
Date: Thu, 20 Aug 2009 12:01:20 -0400

On Thu, 20 Aug 2009 11:07:12 -0400
"McGovern, James F (HTSC, IT)" <James.McGovern at thehartford.com> wrote:

Here is where my enterpriseyness will show. I believe the question of
where secure coding belongs in the curriculum is somewhat flawed: it
requires addressing the curriculum holistically.
 
If you go to art school, you are required to study the works of the
masters. You don't attempt to paint a Picasso in the first semester, yet
we IT folks think it is OK to write code before studying the differences
between good code and bad code. If a student never learns good from bad
and over time develops bad habits, then teaching security at ANY stage
later in life is the wrong answer. We need to remix the way IT is taught
in universities and revisit the fundamentals of how to approach IT as a
whole.
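
To make the "good code versus bad code" contrast concrete, here is a minimal
sketch (my own illustration, not from either poster — the table, names, and
data are invented) of the same database lookup written first insecurely and
then securely in Python:

```python
import sqlite3

# Hypothetical example: table, column names, and data are invented
# purely to illustrate the good-vs-bad contrast.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

def lookup_bad(name):
    # Bad habit: splicing untrusted input into the SQL string.
    # An attacker-controlled `name` can rewrite the query itself.
    query = "SELECT role FROM users WHERE name = '%s'" % name
    return conn.execute(query).fetchall()

def lookup_good(name):
    # Good habit: a parameterized query; the driver keeps `name` as data,
    # so it can never change the structure of the SQL statement.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)).fetchall()

payload = "' OR '1'='1"
print(lookup_bad(payload))   # [('admin',), ('user',)] -- every row leaks
print(lookup_good(payload))  # [] -- no user has that literal name
```

A student who only ever sees the first version develops exactly the kind of
habit that is hard to unlearn later; the two functions look almost identical
on the page, which is why the difference has to be taught, not absorbed.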
 
My second and conflicting opinion says that universities shouldn't be
teaching secure coding, as they won't get it right.

That's a bold statement.  I've been teaching secure coding for many years
at Purdue.  Nobody has told me before that what I did had a negative impact ;)

Students should
understand the business/economic impact that lack of secure coding
causes. If this is left strictly to Universities, it will most certainly
feel academic (in the bad sense). 

Applying this logic in a different domain, people shouldn't be taught gun safety
until they're on the firing range and have a gun in their hands...  

What you do in practice with the knowledge you acquire anywhere is always up to
you.  No matter the subject, when you're told you shouldn't "touch a hot
oven", it's academic, or feels like watching a movie, until you actually touch
one. However, if you were warned and taught good practices, then after the
incident you will quickly practice what you were taught, if you have sense. If
you were never warned, then you'll blame your teachers and whoever else for
leaving that out, and you'll have to spend time unlearning the bad practices
and learning what you should have been taught.  In this sense, doesn't your
second point contradict the first?  In any case, I'd rather not be blamed ;)

A person doesn't become a real IT
professional until they have a few years of real-world experience under
their belts and therefore maybe this is best left to their employers as
part of professional development and/or Master's programs that are
IT-focused but not about the traditional computer-science/software
engineering way of thinking...

I agree that experience matters: it validates what people were taught and
ensures that good practices get applied.  However, given the real-world
security practice in some companies, it seems unlikely that real-world
experience alone would teach people better than academia could (if that's
what you meant).  I believe that the most productive approach at the
university level is to equip and "prime" students so that they know some
good practices (e.g., in the area of secure coding), can think critically,
and can keep learning from their experiences in the real world.

Regards,
Pascal
 


