Firewall Wizards mailing list archives

Re: Active-content filtering (was RE: Buffer Overruns)


From: Dorian Moore <d () kleber net>
Date: Wed, 29 Dec 1999 13:57:47 +0000


        In my experience, the core problem is that 99% of all Web designers
don't have the clues God gave an Irish setter regarding the security
implications of all their fancy bells and whistles.

The fancy bells and whistles don't belong to the web designers : the
issue here is the implementation of a client-side scripting language.
The majority of web designers wouldn't have the foggiest idea how to
exploit the security facilities of that scripting language. Good
designers should make sites which degrade gracefully : but the
commercial expansion of the web has made that impossible, because
budgets drive development, and clients always want the whizz-bang
stuff that is glitzy, and don't give a damn that the users at the end
of the line don't have a 500 MHz PIII machine for the latest
super-complex DHTML to run on...
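(To make the degrade-gracefully point concrete, here's a minimal sketch
in JavaScript. The function name and element id are my own invention,
not from any real site : the idea is simply that the script applies the
fancy effect only when the browser can support it, and leaves the plain
HTML page working otherwise.)

```javascript
// Sketch of "degrade gracefully" : apply the fancy DHTML effect
// only if the browser actually supports what we need, and leave
// the plain HTML alone otherwise. Names here are made up.
function applyFancyEffect(doc) {
  // Feature-detect rather than assume a fast, modern browser.
  if (!doc || typeof doc.getElementById !== "function") {
    return false; // old browser : plain page still works
  }
  var banner = doc.getElementById("banner");
  if (!banner) {
    return false; // element missing : nothing to enhance
  }
  banner.className = "animated"; // pure enhancement, nothing breaks
  return true;
}
```

On a browser without the required support the page simply stays
static, which is the whole point.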

Ultimately users need to understand that what they are downloading from
a site isn't just information, but is also potentially a program which
affects how their system works - and can have a negative effect on their
system. In the same way, users have to be educated that files they are
emailed, or files they receive on zip/floppy/syquest/dat/memory stick or
any other transfer format, can be bad as well as good. I think it's a
good state for mankind to have (in general) trust in what goes on around
them ... if it wasn't for those pesky kids.

The problem with educating users is that when you look at a website,
your interpretation of it is subjective. Take http://www.smile.co.uk -
an online bank. If I see that website I think it looks like it was
designed by a 12 year old for two weeks' pocket money : so I won't trust
them with my money (even though I trust the bank that supports it).
However, that's because I'm technologically, and aesthetically, aware of
what should be done with the technologies. That means a site can't get
past me just by looking good, but there are no standards to define how
far you can trust a site : apart from secure certificates (and I'm not
going to get into exploiting those...). And the majority of lusers don't
know Verisign from Thawte, and will follow what is written on a website
like the Bible (install new signing authority... if you want me to), so
again, how does that help?

        I've gotten in arguments with more than one on-line marketing
rag columnist who has never, ever heard of any kinds of problems with
active content. They simply have no idea whatsoever that these scripting
languages put their customers' clients at risk. It doesn't even register.

Because the majority of producers assume that what they are given is a
stable platform to work in, and are too busy worrying about the
ridiculous deadlines built upon the hype of the web being a simple
medium which is easy and quick to publish on. Yes, they should be aware
of the security issues, but the web wasn't introduced that way, and
client-side scripting has been a race forced by consumer and commercial
desire (with a bit of Microsoft vs Netscape competition thrown in for
good measure). As a developer within that field I often argue away from
the use of these technologies : but ultimately the people want it, and
the clients see their competition doing it : so why _can't_ we... If
there were a legal precedent for a website downloading damaging scripts
("your website crashed my computer") it would be a different story, but
until then (IMHO) it's a matter of peer group pressure...

        The New York Times site is the worst. Not only does it require cookies,
but if you want to complain about THAT practice, their form letter is
Javascript
driven.

I think you've got a confused issue here. The New York Times is a
website providing content for free, content that they have to spend time
and money producing. Hell, if they want to use cookies, that's up to
them : they are providing a service in return, and they use the cookies
to make that service more viable for them to run (to store ID info to
track you round the site)... It's not ideal, but I think you have to
allow them some leeway there. Sure, you can perceive it as an intrusion
on your privacy, but you have to allow that in the first place.
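(For what it's worth, the mechanism they're relying on is about as
simple as it sounds. A rough sketch in JavaScript of both sides of it :
the Set-Cookie/Cookie headers are real HTTP, but the visitor_id scheme
is my own invention for illustration.)

```javascript
// Sketch of cookie-based visitor tracking : the server tags the
// browser once with an ID, and the browser echoes it back on every
// later request. That echo is all the "tracking" amounts to.
// The visitor_id cookie name is made up for illustration.
function buildSetCookieHeader(visitorId) {
  // What the server sends on the first response.
  return "Set-Cookie: visitor_id=" + visitorId + "; Path=/";
}

function readVisitorId(cookieHeader) {
  // What the server parses out of each later request's Cookie header.
  var match = /visitor_id=([^;]+)/.exec(cookieHeader || "");
  return match ? match[1] : null;
}
```

Refuse the cookie and the ID never comes back : the tracking only works
because the browser agrees to play along.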

The Javascript form, well, hell yes, that's stupid. But there is no
perfect solution to the feedback-on-the-web situation, as there are a
hell of a lot of lusers out there who can't even get their email
addresses right!

        Educating the designers is only part of the problem. Making their
clients aware of how they could be hurt, so that they bring pressure to
bear from their side, is also necessary.

And understanding the commercial aspect of the web is essential if you
are using it as a tool for developing business. Unfortunately, no matter
how many people you educate, there is still going to be someone who
wants to do what you tell them not to... whether it's because it
shouldn't be done, or it can't be done. And if it can't be done and they
do it: that's progress as far as I am concerned.

We're working in a developing medium and to help it develop you can't
cast a 'bad' label onto everything : Yes some content is bad, but just
because you once bit into a rotten apple it doesn't stop you from eating
them ever again (though you might not trust the person you got it from). 

Just my opinion...

d.

-- 
Techie wanted, apply within : http://www.kleber.net/job.html

Dorian Moore is property of Kleber Design Ltd. If found please contact Kleber
by phone on +44 207 581 1362 or visit http://www.kleber.net for further details.
You really shouldn't listen to anything he says... as it may just be an opinion
