nanog mailing list archives

Re: decreased caching efficiency?


From: "Dana Hudes" <dhudes () hudes org>
Date: Fri, 20 Oct 2000 11:02:42 -0400


Ian
----- Original Message ----- 
From: "Ian Cooper" <icooper () equinix com>
To: "Dana Hudes" <dhudes () hudes org>
Cc: <nanog () merit edu>
Sent: Friday, October 20, 2000 10:33 AM
Subject: Re: decreased caching efficiency?


At 09:44 10/20/00 -0400, Dana Hudes wrote:


And browsers implementing caches in memory or disk are also causing 
copyright violation.  Yet for the most part browser caches are considered a 
Good Thing.

No and yes. The user has a license to display the image on their screen for as long
as they keep the browser window open on the page. The browser cache is good for the user
because they may scroll or resize the window and don't have to re-fetch.


In general I do not want people to have my photos stored in their browser
cache (much less permanently saved).
I do actually have plans to change around some things in my site to take advantage
of browser and network caching (e.g. putting the style sheet in a separate file,
ditto the JavaScript and any other constant information I can).

And that saves practically nothing, given that they're very small 
files.  Your choice, of course.

True.  The hope, though, is to somehow save processing the files, not just transferring them.
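
To be concrete about the kind of saving I mean -- a rough sketch only, with a placeholder
path, and Python just for illustration (Apache already handles this for plain files):
once the stylesheet is its own object, a revalidation can come back as 304 Not Modified
with no body at all.

#!/usr/bin/env python3
# Illustrative sketch: answer a revalidation with 304 Not Modified,
# skipping both the body transfer and most of the work of producing it.
import os, sys
from email.utils import formatdate, parsedate_to_datetime

STYLESHEET = "/var/www/site/style.css"   # placeholder path

mtime = int(os.path.getmtime(STYLESHEET))
since = os.environ.get("HTTP_IF_MODIFIED_SINCE")

if since:
    try:
        if parsedate_to_datetime(since).timestamp() >= mtime:
            # The client's copy is still current: no body, almost no work.
            sys.stdout.write("Status: 304 Not Modified\r\n\r\n")
            sys.exit(0)
    except (TypeError, ValueError):
        pass  # unparsable date: fall through and send the full response

body = open(STYLESHEET, "rb").read()
sys.stdout.write("Content-Type: text/css\r\n")
sys.stdout.write("Content-Length: %d\r\n" % len(body))
sys.stdout.write("Last-Modified: %s\r\n" % formatdate(mtime, usegmt=True))
sys.stdout.write("\r\n")
sys.stdout.flush()
sys.stdout.buffer.write(body)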


When I switch to CGI-based delivery of images the cache will of course become pass-through,
since there will be no file to cache, just a stream of bytes....

Is the assumption there that by using CGI you'll automatically tweak a 
configuration in a caching proxy?  If so then it's a flawed assumption.

But there is no file to cache? I don't have enough gear to set up a test with Squid myself
(and that would only be one cache), but how is the engine to know to cache it?
My understanding is that CGI-generated content is usually not cached.
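
To put the question concretely -- this is a rough sketch only, with a placeholder path and
Python just for illustration -- if the script emitted the usual validator and freshness
headers itself, would Squid then treat the response like any other cacheable object,
file or no file?

#!/usr/bin/env python3
# Hypothetical CGI image handler: the response comes from a script, but it
# still carries the headers a cache would use to decide cacheability.
# The path and the one-hour lifetime are made-up values for the example.
import os, sys, time
from email.utils import formatdate

IMAGE = "/var/www/photos/sample.jpg"   # placeholder path

mtime = os.path.getmtime(IMAGE)
body = open(IMAGE, "rb").read()

sys.stdout.write("Content-Type: image/jpeg\r\n")
sys.stdout.write("Content-Length: %d\r\n" % len(body))
sys.stdout.write("Last-Modified: %s\r\n" % formatdate(mtime, usegmt=True))
sys.stdout.write("Expires: %s\r\n" % formatdate(time.time() + 3600, usegmt=True))
sys.stdout.write("\r\n")
sys.stdout.flush()
sys.stdout.buffer.write(body)

(I gather the wrinkle is that some proxies also decline by default to cache URLs containing
"cgi-bin" or "?", so the headers may be necessary but not sufficient.)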


Having had a very quick look at your site, it seems a little strange that 
you want to defeat caching of those objects that soak up bandwidth; the 
request to perform "click-through" on the advert suggests that you're using 
the revenue to pay for your bandwidth costs.  (So, one assumes that the 
more the material was cached, the less you'd have to pay, and the less 
you'd have to worry about page impressions.)  I particularly like the way 
that you require my browser to send a Referer field to be allowed to view 
the pictures ;-)

I do indeed use the revenue to pay for bandwidth, but the pictures, by and large
(it's a work in progress), have been tuned for file size; they still take time to decompress,
but hey, what can I do. Also, the projected load vs. the bandwidth is such that I have a LOT
more room left. The users get a reasonably large bitmap in a reasonably small file.
ImageMagick is a nifty set of programs. The problem I have is pirates who collect images
and use them for other purposes.
The pictures... well, I actually don't want them hanging around on the user's disk once the
browser is no longer on the page.
I haven't figured out how to make that happen other than setting an expiration of 1 minute or something.
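
Roughly what I have in mind, as a sketch only (the site name and path are placeholders,
not my real setup): the CGI handler would refuse requests without an acceptable Referer
and stamp the image with a one-minute Expires.

#!/usr/bin/env python3
# Sketch of the idea above: serve the image only when the Referer looks
# like one of my own pages, and mark it stale after about a minute so a
# cache should revalidate rather than keep serving its stored copy.
import os, sys, time
from email.utils import formatdate

ALLOWED_REFERER = "http://www.example.com/"   # placeholder site
IMAGE = "/var/www/photos/sample.jpg"          # placeholder path

referer = os.environ.get("HTTP_REFERER", "")
if not referer.startswith(ALLOWED_REFERER):
    sys.stdout.write("Status: 403 Forbidden\r\n")
    sys.stdout.write("Content-Type: text/plain\r\n\r\n")
    sys.stdout.write("Please view the images from their pages.\n")
    sys.exit(0)

body = open(IMAGE, "rb").read()
sys.stdout.write("Content-Type: image/jpeg\r\n")
sys.stdout.write("Content-Length: %d\r\n" % len(body))
# One minute from now: this only marks the copy stale, it does not remove
# anything already written to the user's disk cache.
sys.stdout.write("Expires: %s\r\n" % formatdate(time.time() + 60, usegmt=True))
sys.stdout.write("\r\n")
sys.stdout.flush()
sys.stdout.buffer.write(body)

Of course an Expires that short only marks the copy stale; it doesn't erase what's already
on the user's disk.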

I'm working on a system for managing/publishing photo web sites like mine. It will be released
as free software when I get it all working satisfactorily, and maybe I'll write some documentation.


Nice photos though...

Thanks. 

You do point out that while I pay a fixed cost for bandwidth (my server is behind a DSL circuit),
others might use the technology to host where they pay for usage as it occurs. A quandary.




