WebApp Sec mailing list archives

Fwd: Combatting automated download of dynamic websites?


From: Mark Quinn <cheeky.mini () gmail com>
Date: Wed, 31 Aug 2005 14:14:23 +0100

How about placing a hidden link (wrapped around a 1x1 transparent pixel),
and banning anyone who "clicks" on it?

 It would just have to be coupled with a robots.txt
file to make sure that legitimate spiders didn't follow it.

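Concretely, the hidden link and the matching robots.txt rule might look
something like this (the /no-follow/trap.html path is purely illustrative):

```html
<!-- invisible honeypot link: no human ever sees or clicks this -->
<a href="/no-follow/trap.html"><img src="/pixel.gif"
   width="1" height="1" alt="" style="border:0"></a>
```

```text
# robots.txt: keeps well-behaved crawlers away from the trap
User-agent: *
Disallow: /no-follow/
```

A polite spider reads robots.txt and skips the link; a bulk downloader or
harvester that ignores both ends up requesting the trap URL.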
 On a related note, the same technique is useful for keeping out Evil
Robots such as spam harvesters and exploit-locator scripts, which
generally don't respect robots.txt anyway: just ban by IP, or
perhaps by User-Agent (if done carefully), when you get a hit on the
disallowed link.
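The server-side half of the idea can be sketched in a few lines of Python.
This is a minimal illustration, not any real framework's API: the
handle_request function and the /no-follow/trap.html path are made up for
the example, and a real deployment would keep the ban list in a shared
store (database, firewall rules) rather than in process memory.

```python
# Hypothetical honeypot path, matching a Disallow rule in robots.txt.
TRAP_PATH = "/no-follow/trap.html"

# IPs that have tripped the honeypot. In-memory for the sketch only.
banned_ips = set()

def handle_request(path: str, client_ip: str) -> int:
    """Return an HTTP status code for a request from client_ip."""
    if client_ip in banned_ips:
        return 403                 # previously banned: refuse everything
    if path == TRAP_PATH:
        banned_ips.add(client_ip)  # only a misbehaving bot gets here
        return 403
    return 200                     # normal page, serve as usual
```

Once an IP fetches the disallowed URL, every subsequent request from it
is refused, which is the "banninate by IP" behaviour described above.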
