WebApp Sec mailing list archives

Re: Combatting automated download of dynamic websites?


From: "Tony Stahler" <TStahler () tempographics com>
Date: Tue, 30 Aug 2005 11:51:09 -0500

That's a good idea; it would just have to be coupled with a robots.txt
file to make sure that legitimate spiders didn't follow it. (I believe
he said he wanted Google to be able to cache his page, though I could
be wrong; the original post was a day or two ago.)
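Something like the following robots.txt entry would do it (a minimal
sketch; the /trap/ path is an assumption and should match wherever the
hidden link actually points):

    User-agent: *
    Disallow: /trap/

Well-behaved crawlers such as Googlebot honor the Disallow line and
never request the trap URL, so the pages stay cacheable by Google
while anything that ignores robots.txt walks straight into the ban.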

-Tony

>>> Michael Boman <michael.boman () gmail com> 08/30/05 10:53AM >>>
On 8/30/05, Matthijs R. Koot <matthijs () koot biz> wrote:
Thanks for your reply zeno! But actually, referer-based anti-leeching
won't do it for me, and mod_throttle isn't suitable for Apache 2. I'm
in need of a throttling function based on something more advanced,
like a 'request history stack' that checks the order in which pages
were requested, probably within a certain time period, et cetera.
Maybe it'd be better to move such security measures into the actual
web application itself, but I'm still hoping someone knows of a
service-based solution (i.e. like the aforementioned Apache module).

Matthijs
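
A 'request history stack' of that sort is easy to sketch in
application code. Here is a rough, hypothetical Python fragment, not
anything Matthijs described concretely; the window size, request cap,
and the sequential-path heuristic are all assumptions:

    import time
    from collections import defaultdict, deque

    # Per-client history of (timestamp, path), most recent last.
    HISTORY = defaultdict(lambda: deque(maxlen=50))

    WINDOW = 60        # seconds of history to consider
    MAX_REQUESTS = 30  # more than this per window looks automated

    def is_suspicious(client_ip, path):
        """Record this request and report whether the client
        looks like a bulk downloader."""
        now = time.time()
        hist = HISTORY[client_ip]
        hist.append((now, path))

        recent = [(t, p) for t, p in hist if now - t <= WINDOW]
        if len(recent) > MAX_REQUESTS:
            return True  # too many requests in the window

        # Crude order check: a strictly increasing run of numeric
        # page IDs (/page/1, /page/2, ...) is the classic signature
        # of a site ripper walking the site in order.
        ids = [p.rstrip('/').rsplit('/', 1)[-1] for _, p in recent]
        numeric = [int(i) for i in ids if i.isdigit()]
        if (len(numeric) >= 10 and numeric == sorted(numeric)
                and len(set(numeric)) == len(numeric)):
            return True

        return False

A real deployment would also need to expire idle clients and
whitelist known crawlers, but it shows why this kind of stateful
check fits more naturally in the application than in a stock
Apache module.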

How about placing a hidden link (around a 1x1 transparent pixel) and
banning anyone who "clicks" on it?
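
In HTML terms the trap is just an invisible anchor, e.g.:

    <a href="/trap/"><img src="1x1.gif" width="1" height="1"
        alt="" border="0"></a>

and behind it a handler that records the caller. A rough sketch as an
old-style Python CGI script (the /trap/ URL and the ban-file location
are made up for illustration):

    #!/usr/bin/env python
    # Hypothetical handler behind the hidden /trap/ link. Any client
    # requesting it has ignored robots.txt, so record its address for
    # banning (e.g. a cron job feeds this file to the firewall).
    import os

    BAN_FILE = '/var/lib/trap/banned_ips'  # assumed location

    ip = os.environ.get('REMOTE_ADDR', 'unknown')
    with open(BAN_FILE, 'a') as f:
        f.write(ip + '\n')

    print('Content-Type: text/plain')
    print('')
    print('Not found.')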

Best regards
 Michael Boman

-- 
IT Security Researcher & Developer
http://proxy.11a.nu
