WebApp Sec mailing list archives
Fwd: Combatting automated download of dynamic websites?
From: Mark Quinn <cheeky.mini () gmail com>
Date: Wed, 31 Aug 2005 14:14:23 +0100
How about placing a hidden link (around a 1x1 transparent pixel), and getting anyone who "clicks" on it banned?
It would just have to be coupled with a robots.txt file to make sure that legitimate spiders didn't follow it.
On a related note, the same technique is useful to keep out Evil Robots such as spam harvesters and exploit-locator scripts, which generally don't respect robots.txt anyway - just banninate by IP, or perhaps by User-Agent (if done carefully), if you get a hit on the disallowed link.
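A minimal sketch of the honeypot-link idea, assuming a hypothetical trap path `/trap` and an in-memory IP ban list (a real deployment would hook into the web server or firewall instead):

```python
# Honeypot-link ban sketch. The trap path "/trap" and the pixel image
# name are assumptions for illustration, not from the original post.

HONEYPOT_PATH = "/trap"

# robots.txt tells well-behaved spiders to skip the trap, so only
# robots that ignore robots.txt ever follow the hidden link.
ROBOTS_TXT = "User-agent: *\nDisallow: /trap\n"

# The hidden link wrapped around a 1x1 transparent pixel, embedded in pages:
HONEYPOT_LINK = (
    '<a href="/trap"><img src="pixel.gif" width="1" height="1" alt=""></a>'
)

banned_ips = set()

def handle_request(client_ip, path):
    """Return a status line, banning any client that hits the trap."""
    if client_ip in banned_ips:
        return "403 Forbidden"
    if path == HONEYPOT_PATH:
        # Legitimate spiders were warned off by robots.txt; anyone
        # reaching this path is treated as an Evil Robot and banned.
        banned_ips.add(client_ip)
        return "403 Forbidden"
    if path == "/robots.txt":
        return ROBOTS_TXT
    return "200 OK"
```

Once a client fetches the disallowed link, every later request from that IP is refused; banning by User-Agent would work the same way but needs more care, since User-Agent strings are trivially forged.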