Snort mailing list archives

RE: What am I Protecting Against?


From: "Wilcoxen, Scott" <SWilcoxen () macf com>
Date: Mon, 2 Jun 2003 22:43:04 -0400

<snip>

Now that I've got ACID running, I'm attempting to make sure I
understand
what alerts I'm seeing and why I'm seeing them.  Obvious, ain't it?

So I'm trying to figure out what some rules are actually trying to
protect me against; sometimes, there are references to actual docs that
make this obvious; sometimes, the rule documentation covers it.
However, some rules are still undocumented.  So for example, I give you
SID 1852:
alert tcp $EXTERNAL_NET any -> $HTTP_SERVERS $HTTP_PORTS \
    (msg:"WEB-MISC robots.txt access"; flow:to_server,established; \
    uricontent:"/robots.txt"; nocase; reference:nessus,10302; \
    classtype:web-application-activity; sid:1852; rev:3;)

As I see it, this alerts you of any attempts by anyone to access
/robots.txt on your HTTP server.

So hey, maybe I'm an idiot, but why? Trying to get /robots.txt is a
simple part of any search engine that spiders your site.  _I_ don't see
it as a security issue at all.  Am I missing something?
<snip>

Well, for starters most of the rules include some form of reference to
documentation on that particular rule, either at www.snort.org or
elsewhere.  If you're running ACID it's actually quite simple to look
these up.  In the signature column of your display you'll notice links
to Snort, Nessus, CVE, ICAT, ArachNIDS, and more.  Nessus says the
following about the rule you've just referenced:

Some web servers use a file called /robot(s).txt to make search engines
and any other indexing tools visit their web pages more frequently and
more efficiently.  By connecting to the server and requesting the
/robot(s).txt file, an attacker may gain additional information about
the system they are attacking, such as restricted directories, hidden
directories, and cgi script directories.  Take special care not to tell
the robots not to index sensitive directories, since this tells
attackers exactly which of your directories are sensitive.

Basically this tells me that I want to make sure I'm not giving much
away in my robots.txt file.  So let's call this more of an
"informational/heads up" rule than anything else.  

<snip>
My goal is to get to the point that I log all things reasonably
considered intrusions or recon, but to only alert on things that are
actually threats -- in other words, I don't want to know at 2am that
someone's trying to compromise my MS SQL Server, since it's running on
UNIX and isn't MS SQL.  Oh, and it's not available to the net :).

So, you're not running MS SQL.  You could always remove the reference to
sql.rules from your snort.conf file.  Just a thought.  I believe the
docs state that it is pointless to monitor for <insert name of service>
intrusion attempts if you're not running <insert name of service>.
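
Assuming you're using the stock snort.conf layout, where rule files are
pulled in with include directives, that's just a matter of commenting
one line out:

  # not running MS SQL, so skip these rules
  # include $RULE_PATH/sql.rules
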
Also, going back to the rule you mentioned above: it is only going to
check for this content on packets coming from anywhere on the external
network to your web servers, on the ports you've defined in snort.conf.
So, let's suppose for example you've got a web server up at 192.168.1.2
and another at 192.168.1.3.  Furthermore, let's assume they are both
listening only on port 80 and that these are public addresses.  If
you've defined $HTTP_SERVERS and $HTTP_PORTS accordingly in your
snort.conf file, this rule will only look for this content in packets
with a destination address of either 192.168.1.2 or 192.168.1.3 and a
destination port of 80.  So if you've got an SMTP server at 192.168.1.4,
Snort won't even check packets going to it for "/robots.txt".  The
documentation explains all of this quite nicely.
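
In snort.conf, that setup would look something like this (the addresses
are just the ones from the example above):

  var HTTP_SERVERS [192.168.1.2/32,192.168.1.3/32]
  var HTTP_PORTS 80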

I just started using Snort a couple months back and I've spent many
hours reading about various vulnerabilities and researching rules.  Not
all of them are well documented, and in those cases Google has proven
invaluable.  Yes, it's time consuming, but I find I understand what I'm
doing much more thoroughly when I've researched it myself.


You mentioned you're running ACID, so I'm assuming you're running MySQL
or an equivalent on the backend to log your alerts to.  I created two
ruletypes: one for vulnerabilities I'm definitely patched/secured
against, and another for everything else.  My "patched" ruletype logs to
a second database, and I've set up a second instance of ACID to review
those alerts.  It's nice to know what people are trying, even if I'm
fairly confident they aren't going to succeed.
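
In snort.conf, a custom ruletype along those lines looks roughly like
this (the ruletype name, database name, and credentials below are
placeholders; adjust them for your own setup):

  # alerts we're already patched against: log them to a separate
  # database instead of alerting
  ruletype patched
  {
      type log
      output database: log, mysql, user=snort password=xxxx dbname=snort_patched host=localhost
  }

You'd then change the matching rules from "alert tcp ..." to
"patched tcp ..." so they're handled by the new ruletype.
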
Besides, some of the older rules, especially the IIS web server ones,
will actually alert me to newer attacks for which specific rules haven't
been written yet.  The rules provided on the Snort site are meant to be
customized for your particular environment, not used straight "out of
the box" so to speak.  It's taken some time, but I've finally got my
rulesets tweaked for my environment, and the "false" alerts I get are
relatively minimal.  There is still a good bit of data for me to analyze
every day, but it's not outlandish.  And on top of all that, it's been
fun getting it to this point!!

Sorry to be so long winded,
Hope some of this helps,

Scott S Wilcoxen



