PaulDotCom mailing list archives

No subject


From: bogus () does not exist com ()
Date: Tue, 04 Aug 2009 23:25:38 -0000

each economic calculation. In the example of lottery tickets, it's the
fantasy of extreme wealth. Depending on the culture and the personality,
humiliation avoidance carries a high 'X' factor premium. Regulatory
compliance or contract requirements can be a huge 'X' factor. Fear can be a
huge 'X' factor. Each person has their own set of 'X' factor concerns. Our job is to identify those concerns, address them, and make appropriate recommendations.
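
To put rough numbers behind that, here is a minimal Python sketch of the calculation (every figure below is invented; the only point is how a large 'X' factor premium can flip an otherwise "irrational" precaution into a sensible one):

# A rough sketch of the 'economic calculation plus X factor' idea.
# All numbers are made up purely for illustration.
p_incident   = 0.01      # assumed yearly chance of the bad event
direct_loss  = 2000      # assumed direct dollar cost if it happens
control_cost = 150       # assumed yearly cost of the precaution

# Strictly economic view: expected loss is $20/year, so skip the $150 control.
expected_loss = p_incident * direct_loss

# Now add a non-monetary 'X' factor premium (humiliation, fear, compliance exposure).
x_premium = 50000        # assumed value placed on avoiding the outcome at all
expected_loss_with_x = p_incident * (direct_loss + x_premium)   # $520/year

print(expected_loss, control_cost, expected_loss_with_x)   # 20.0 150 520.0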

An interesting example is the money and effort spent on preventing terrorism
on airplanes. From a strictly economic perspective, it doesn't make sense to put our current safeguards in place. However, few people would recommend
removing the security controls in place when they are getting ready to board
a plane. That's an 'X' factor.

I question the premise that the probability of an end user having a problem is slight. When you look at the rapid spread of malware, the growth of botnets, identity theft, and banking trojans, there is a lot going on. Most end users do not realize what threats are out there or how frequent the attacks actually are. These are the people we need to educate.

Can we reach everyone? No. There are *always* irrational people. In my
working life, I've encountered several people who simply cannot be reasoned with, for various reasons. When they are 'worker bees' they can be dealt with
by management. When they are management, nature seems to take care of things
in due time (and you don't want to be around when it happens).

Bart

On Mon, Feb 15, 2010 at 12:43 PM, Jack Daniel <jackadaniel at gmail.com> wrote:

But that's the point: the actual risk to the end user is negligible if you do the math - the costs of being hit are often low, but even if they were high, the chance of compromise is so low that the distributed risk is still negligible. If the aggregated risk is low per user, then it is economically irrational to take extra measures to protect yourself.
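
(As a back-of-the-envelope illustration of the math being appealed to here, a small Python sketch - every input is invented:)

# Assumed per-user numbers; none of these are measured data.
p_compromise = 0.005     # assumed yearly chance a given user is hit
cost_if_hit  = 300       # assumed cost to that user (time, hassle, dollars)
expected_loss = p_compromise * cost_if_hit          # 1.5 dollars/year

minutes_per_day = 5      # assumed daily time spent on extra precautions
hourly_value    = 20     # assumed value of the user's time
work_days       = 250
yearly_effort = minutes_per_day / 60.0 * hourly_value * work_days   # ~417 dollars/year

# The effort dwarfs the expected loss, so the extra measures are
# economically irrational for this user.
print(expected_loss, yearly_effort)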

And - it is not their fault. They are expected to use fundamentally insecure (and largely unsecurable, practically speaking) systems, and "being secure" is not their job; their job is to produce/sell/whatever.

Jack



On Mon, Feb 15, 2010 at 12:49 PM,  <d4ncingd4n at gmail.com> wrote:
I think it also helps to explain the personal risk to them. If their computer is used to host kiddie porn, they would have to deal with the *embarrassment*, and the risk of being wrongly convicted could destroy their lives personally and professionally. Identity theft can be inconvenient even if you have protection. If your company has its ACH account hit for hundreds of thousands of dollars because THEIR PC had Zeus or Clampi on it, and the company folds, you will lose your job.

FUD? I don't think so. You just have to find a way to make it real to them instead of something you see in the movies or the self-important delusions of paranoid nerds (which is how we are sometimes, unfortunately, seen).

Bart
Sent from my Verizon Wireless BlackBerry

-----Original Message-----
From: Jack Daniel <jackadaniel at gmail.com>
Date: Mon, 15 Feb 2010 12:22:48
To: PaulDotCom Security Weekly Mailing List <pauldotcom at mail.pauldotcom.com>
Subject: Re: [Pauldotcom] End user education

I need to craft a longer answer, but I will say the results of user education programs are very dependent on the end users being taught. I have had much better luck with some groups than others. The car business? That is definitely a "teaching pigs to sing" experience. Thanks for the insights Raffi and Jody.

I think we'll be hearing more about this topic ;)

Jack


On Sun, Feb 14, 2010 at 9:17 PM, Raffi Jamgotchian
<raffi at flossyourmind.com> wrote:
Jack,

I used to feel the same way that you do, only a few years ago. I think it was particularly because the security program at the larger corporation I came from was ineffective. The problem with giving up on the end user is that you end up spending too much time and money on tools. I know those things are not necessarily exclusive of each other, but hear me out.

When I was asked to be CTO of a small investment firm startup (after I left the larger investment firm noted above), I told every security startup I met that I would put their product into my environment at no or low cost, in return for giving them feedback and letting them use our company name in their marketing. Besides finding myself becoming somewhat of a tech whore (sorry if that offends), I found that I was spending too much time overcomplicating the environment, which led to other issues. Both of those left a bad taste in my mouth, so I made a conscious switch.

Since then, I've moved into a consulting role with the same firm as well as a few other small investment and non-investment firms. I've found that spending one-on-one time on the consequences, in addition to pragmatic controls, is the best defense we have today. Small businesses typically don't have the resources to spend oodles of money on tools and people, so they have to be, as Mick said at ShmooCon, "secure enough."

The church I go to has a prototypical, very conservative Armenian priest. His sermons are super long and are delivered in two languages (Armenian and English). When he wants to teach or preach to a point, he says the same thing three different ways, and then again in both languages. So someone who understands both languages gets the same lesson six times. Guess what: it eventually sinks in. Although we like to treat employees like adults, and we expect them to behave that way, the truth is that most adults (like kindergarteners) need repetition in different ways to properly learn. As security practitioners (and I'll speak to the small business market, since that's what I focus on nowadays), we need to be equal parts technologist, to minimize the breakage when things happen, and teacher of the business consequences of people's actions. If you work the consequences into the conversations in different ways, repetitiously, it does eventually sink in, but it doesn't happen overnight.

Thanks for sending those links over. I'm always interested in seeing how others feel about this, since my position is an evolving one.

-----Original Message-----
From: pauldotcom-bounces at mail.pauldotcom.com
[mailto:pauldotcom-bounces at mail.pauldotcom.com] On Behalf Of Jack Daniel
Sent: Sunday, February 14, 2010 2:17 PM
To: PaulDotCom Security Weekly Mailing List
Subject: [Pauldotcom] End user education

You've probably all seen Larry's fudsec post at
http://fudsec.com/casual-hex-and-the-failure-of-security-awaren (You haven't? Go now, and make sure you read the comments). I think it is a good starting point for a conversation we need to have in InfoSec.

I have largely lined up with the dinosaurs like Ranum in my skepticism of the value of user education, but have tried anyway. I almost always come back to Robert Heinlein's quote: "Never try to teach a pig to sing; it wastes your time and it annoys the pig." We do get some successes, but at what cost?

A more informed look at the education we give end users, and the reasons that they should reject the advice, is found in a paper Cormac Herley delivered last year. I read it when it came out, and keep going back to it. It isn't very long, but it isn't really a light read, either. PDF is at

http://research.microsoft.com/users/cormac/papers/2009/SoLongAndNoThanks.pdf

You may notice that this is focused on the home user, not the corporate end user - that is on purpose; there just isn't enough data to extrapolate conclusions with the level of detail he wanted. Cormac has observed that end users in business are rejecting the advice anyway. I do think the numbers have to shift significantly when we factor in the costs of breaches to organizations and the fact that many fraud protections offered to individuals do not apply to businesses. My gut feeling is that rejecting a lot of "security advice" still makes economic sense, at least from the corporate end-user perspective, but the margins are slimmer.
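
To make "the margins are slimmer" concrete, here is the same back-of-the-envelope comparison from the corporate side; all of these numbers are invented for illustration only:

# Assumed per-employee numbers for a small business; not measured data.
p_breach    = 0.02       # assumed yearly chance an employee's box causes a breach
breach_cost = 50000      # assumed cost to the business (downtime, ACH fraud, cleanup)
expected_loss = p_breach * breach_cost              # 1000 dollars/year per employee

advice_and_controls = 600   # assumed yearly per-employee cost of education + lockdown

# Unlike the home-user case, the two figures are now the same order of
# magnitude - the margins are slimmer, even if the conclusion can go either way.
print(expected_loss, advice_and_controls)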

There is also the issue of the true cost of breaches; if I have a fraudulent charge on a card I am not out any money *directly*, but we're all paying double-digit interest rates on credit cards when the prime is below a percent, partly to cover fraud expenses - and the price of goods includes an added margin to cover "shrinkage" (theft, loss, fraud, etc.). We are all paying for the fraud, but the true costs are so obfuscated that we don't know what the real numbers are.

I'm not sure where we go from here, but I do believe we need to be able to honestly answer the question "is it worth it?" before we hand out security advice and education, especially the same stuff we've been saying for years.

I think it makes sense to use this information to justify some lockdown of corporate assets; if the users can't be relied on to protect the assets (and arguably shouldn't have to), then we need to secure them before letting people loose to do their jobs.

I have exchanged a few emails with Cormac; he has received a pretty good response to the paper, and he is certainly a sharp guy. Hey, there's a guest idea for the podcast...
(Paul's idol, Steve Gibson, even covered this paper, but of course didn't speak to Cormac about it).

Jack


--
______________________________________
Jack Daniel, Reluctant CISSP
http://twitter.com/jack_daniel
http://www.linkedin.com/in/jackadaniel
http://blog.uncommonsensesecurity.com
I agree that it doesn't make sense for the end-user to take certain high cost/low payoff precautions, and it shouldn't even be suggested. But low cost/high payoff solutions (passwords/AV/etc.) should be a no-brainer (and what I believe we are discussing here).

People are seldom economically rational. They are also bad at estimating risk. Besides, many people are just plain bad at math. ;) That's one reason we have jobs - to recommend appropriate controls to mitigate actual risks.

