Snort mailing list archives

problem with snort 2.0.1 and disabled rules


From: Michael Scheidell <scheidell () secnap net>
Date: Sat, 2 Aug 2003 05:54:46 -0400 (EDT)

Thanks for fixing that libnet problem with broken libnet-config programs.
Compiling on a FBSD system with flexresp enabled went fine.

On both an FBSD 4.8 box and a legacy 3.5.1 box, specifying --with-libraries and
--with-includes worked.

I have a problem that showed up on snort 1.9 and 2.0, and it involves snort
processing disabled rules.

Specifically, it's in the processing of the disabled robots.txt rule.

On snort 1.9 and 2.0 I modified all my disabled rules from:

#alert yada yada yada

to

# alert yada yada yada

and it SEEMED to work (it stopped processing disabled rules).
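
(For the record, the change looked like this on an actual rule; the body is just
the sid 1852 rule quoted further down, shortened here to one line:)

#alert tcp $EXTERNAL_NET any -> $HTTP_SERVERS $HTTP_PORTS (msg:"WEB-MISC robots.txt access"; ... sid:1852; rev:3;)

became

# alert tcp $EXTERNAL_NET any -> $HTTP_SERVERS $HTTP_PORTS (msg:"WEB-MISC robots.txt access"; ... sid:1852; rev:3;)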

However, last night I upgraded to the newly compiled snort 2.0.1, and
shortly afterwards got these alerts:

08/02-06:52:49 GMT TCP 64.68.82.45:18968 --> 208.237.xxx.xxx:80
[1:1852:3] WEB-MISC robots.txt access

08/02-04:49:06 GMT TCP 66.147.154.3:9791 --> 208.237.xxx.xxx:80
[1:1852:3] WEB-MISC robots.txt access

08/02-03:18:08 GMT TCP 12.148.209.198:19949 --> 208.237.xxx.xxx:80
[1:1852:3] WEB-MISC robots.txt access

The rule file shows it disabled (same rule file I used with 2.0):

pwd
/usr/local/share/snort/rules
grep robots web-misc.rules

# alert tcp $EXTERNAL_NET any -> $HTTP_SERVERS $HTTP_PORTS (msg:"WEB-MISC
robots.txt access"; flow:to_server,established; uricontent:"/robots.txt";
nocase; reference:nessus,10302; classtype:web-application-activity;
sid:1852; rev:3;)

A grep of the web server access log shows robots.txt accesses prior to the
upgrade to snort 2.0.1 (with no corresponding snort alerts); after the upgrade,
every robots.txt access is logged by snort.

(all times adjusted to GMT)

ls -l /usr/local/bin/snort
-rwxr-xr-x  1 root  wheel  1336633 Aug  1 22:05 /usr/local/bin/snort

216.39.48.20 - - [01/Aug/2003:20:16:09 -0400] "GET /robots.txt HTTP/1.1"
200 140 "-" "Scooter/3.2"

(prior to upgrade, no snort alert)

After the upgrade: three robots.txt accesses, three alerts.

12.148.209.198 - - [02/Aug/2003:03:18:07 -0400] "GET /robots.txt HTTP/1.1"
200 140 "-" "NPBot (http://www.nameprotect.com/botinfo.html)"
66.147.154.3 - - [02/Aug/2003:04:49:06 -0400] "GET /robots.txt HTTP/1.0"
200 140 "-" "http://www.almaden.ibm.com/cs/crawler   [c01]"
64.68.82.45 - - [02/Aug/2003:06:52:49 -0400] "GET /robots.txt HTTP/1.0"
200 140 "-" "Googlebot/2.1 (+http://www.googlebot.com/bot.html)"
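
(If anyone wants to reproduce the comparison, a rough sketch is below; the web
log path is an assumption for my Apache layout, and the snort alert file is
just the default one under the -l directory:)

# robots.txt hits in the web log vs. the corresponding snort alerts
grep 'GET /robots.txt' /var/log/httpd-access.log
grep 'robots.txt access' /var/log/snort_wan/alert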

compiled with ./configure --enable-flexresp 
snort.conf:

preprocessor frag2

preprocessor stream4: noinspect, disable_evasion_alerts, ttl_limit 0

preprocessor stream4_reassemble: noalerts

preprocessor http_decode: 80 unicode iis_alt_unicode double_encode \
 iis_flip_slash full_whitespace

preprocessor telnet_decode

snort started thus:

echo "snort_wan"
/usr/local/bin/snort -doDI -m 022 -z \
-c /etc/snort/snort_wan.conf -i $wan -l /var/log/snort_wan \
-F /etc/snort/snort_wan.bpf 2>&1

System is FBSD 4.8, 768 MB RAM, IBM x300 1.0 GHz PIII.

A grep of /usr/local/src/snort/rules (the new rules) shows that '# alert' seems
to be the right way to disable a rule, so what am I doing wrong?

# alert tcp $EXTERNAL_NET any -> $HOME_NET $HTTP_PORTS (msg:"WEB-MISC
Lotus Notes .csp script source download attempt";
flow:to_server,established; uricontent:".csp.";
classtype:web-application-attack; sid:2065; rev:1;)
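
(One sanity check worth sketching here, using my own paths, so treat them as
assumptions: make sure snort_wan.conf isn't pulling in another rules file that
still has an uncommented copy of sid 1852, and that it really includes the
directory I grepped above.)

# which rules files does the wan config actually load?
grep -i '^include' /etc/snort/snort_wan.conf

# every copy of sid 1852 -- check whether any is left uncommented
grep -n 'sid:1852' /usr/local/share/snort/rules/*.rules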

-- 
Michael Scheidell,
Main: 561-368-9561 / www.secnap.net

