WebApp Sec mailing list archives

Re: Idea for making SSL more efficient


From: "Frank O'Dwyer" <fod () littlecatZ com>
Date: Sun, 18 Jul 2004 14:36:20 +0100

Kurt Seifried wrote:

> You would need to 100% guarantee the proxy doesn't serve old versions of
> images, otherwise users will get nasty error messages saying content is
> being played with/etc.
That's why I said you have to go directly to the source server as a final check if the hash check fails. That fetch would have to be over SSL if you want to be sure no proxy is involved. If the hash check still fails at that point, then the content has been tampered with, and only then do you raise the alarm. If it passes, then at least you've got the latest (and correct) image, which is probably a good idea too.
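To make that flow concrete, here's a minimal sketch of the check described above. It assumes the expected hash for each image arrives with the SSL-protected page; the function names and URLs are purely illustrative, not part of any proposed spec:

    import hashlib
    import urllib.request

    def fetch(url):
        # Plain fetch: a plain-HTTP URL may well be served from an
        # intermediate proxy or cache rather than the origin server.
        with urllib.request.urlopen(url) as resp:
            return resp.read()

    def get_verified_image(http_url, https_url, expected_sha256):
        # The expected hash is assumed to have been delivered over SSL.
        data = fetch(http_url)
        if hashlib.sha256(data).hexdigest() == expected_sha256:
            return data        # cached/proxied copy is intact
        # Hash mismatch: go directly to the source server over SSL.
        data = fetch(https_url)
        if hashlib.sha256(data).hexdigest() == expected_sha256:
            return data        # latest, correct image straight from the origin
        # Still failing even from the origin over SSL: only now raise the alarm.
        raise ValueError("content has been tampered with: " + https_url)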

> Unless you get MS to back this you gain at most a few % savings of a few
> percent (single digits) of the browsers.
Sure, but only from the server's perspective (unless you have control over the browsers your clients use). From an individual client's perspective, it doesn't matter what browsers other people use. Plus, if it were built into Mozilla/Apache and turned out to have a noticeable performance impact, then MS might support it anyway. It's no different from any other speedup idea in that respect (persistent connections, HTTP pipelining, SSL connection caching, ordinary caching, etc. - all once upon a time had limited support, and none of them is at 100% even now).

> Plus all the browsers that do NOT
> support it spit up nasty "you are downloading mixed content, it may be
> insecure blahblah" messages.
Not if unmodified browsers were made to see an SSL link, as I suggested. See my earlier remarks about backwards compatibility. That could be done in the HTML itself, or, as you've pointed out, the server could use the 'User-agent' header.
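As a rough illustration of the User-agent approach (the advertised token, attribute, and host names below are hypothetical, just to show the shape of it), the server could emit a cheap plain-HTTP image link only to browsers that claim support, and an ordinary SSL link to everyone else:

    # Hypothetical token a cooperating browser might add to its
    # User-Agent string; the host names are placeholders too.
    SUPPORTS_HASHED_HTTP = "hashed-content/1.0"

    def image_link(user_agent, path, sha256_hex):
        # Emit an <img> tag for a page that is itself served over SSL.
        # Browsers that advertise support get the plain-HTTP URL plus the
        # expected hash; everyone else gets an ordinary SSL link, so
        # unmodified browsers never see "mixed content" warnings.
        if SUPPORTS_HASHED_HTTP in user_agent:
            return ('<img src="http://static.example.com%s" '
                    'data-sha256="%s">' % (path, sha256_hex))
        return '<img src="https://www.example.com%s">' % (path,)

    print(image_link("Mozilla/5.0 hashed-content/1.0", "/logo.png", "ab12..."))
    print(image_link("MSIE 6.0", "/logo.png", "ab12..."))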

Cheers,
Frank.

