Snort mailing list archives

HTTP robot detection?


From: "Sheahan, Paul (PCLN-NW)" <Paul.Sheahan () priceline com>
Date: Thu, 24 Jan 2002 15:54:36 -0500


Anyone have any ideas on this one?

Is there a way to make Snort detect someone running an automated script or
robot against a website, the same way it detects portscans? For example, Snort
flags traffic as a portscan when one host connects to a certain number of
ports within a certain time period. Is there a way to do the same thing with
URLs, so that a certain number of URLs requested from one IP address within a
certain time period gets flagged as some sort of automated tool or robot
scanning the site?
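
Just to make the idea concrete, here is a rough sketch of the kind of per-IP
counting I mean. It's in Python, reads a generic Apache-style access log on
stdin, and the threshold and window values are made up. This is purely
illustrative, not something Snort does today:

# Rough sketch of the per-IP request-rate check described above.
# Reads an Apache-style access log on stdin and flags any client IP
# that makes more than THRESHOLD requests within WINDOW seconds.
# (Log format, THRESHOLD, and WINDOW are placeholders.)

import re
import sys
from collections import defaultdict, deque
from datetime import datetime

THRESHOLD = 50        # requests allowed per window before flagging
WINDOW = 60           # sliding window size in seconds

# e.g. 10.0.0.1 - - [24/Jan/2002:15:54:36 -0500] "GET /index.html HTTP/1.0" 200 1234
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "(?:GET|POST|HEAD) (\S+)')

def parse_time(stamp):
    # Ignore the timezone offset; only relative spacing matters here.
    return datetime.strptime(stamp.split()[0], "%d/%b/%Y:%H:%M:%S")

recent = defaultdict(deque)   # ip -> timestamps of its recent requests
flagged = set()

for line in sys.stdin:
    m = LINE_RE.match(line)
    if not m:
        continue
    ip, stamp, url = m.groups()
    t = parse_time(stamp)
    q = recent[ip]
    q.append(t)
    # Drop requests that have fallen out of the sliding window.
    while q and (t - q[0]).total_seconds() > WINDOW:
        q.popleft()
    if len(q) > THRESHOLD and ip not in flagged:
        flagged.add(ip)
        print("possible robot: %s made %d requests in %ds (last URL %s)"
              % (ip, len(q), WINDOW, url))

Ideally Snort could do something equivalent on the wire, rather than after the
fact from the web server logs.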

Thanks!


_______________________________________________
Snort-users mailing list
Snort-users () lists sourceforge net
Go to this URL to change user options or unsubscribe:
https://lists.sourceforge.net/lists/listinfo/snort-users
Snort-users list archive:
http://www.geocrawler.com/redir-sf.php3?list=snort-users

