
RE: Intrusion Prevention


From: "Golomb, Gary" <GGolomb () enterasys com>
Date: Mon, 6 Jan 2003 16:07:44 -0500


See below.

-----Original Message-----
From: Rick Williams [mailto:rickwi () hotmail com] 
Sent: Wednesday, December 25, 2002 2:30 PM
To: focus-ids () securityfocus com
Subject: Re: Intrusion Prevention

{snip}

...and I personally would not be putting ANY product forward to be
considered until I had seen what the NSS guys had to say about it.

I am hoping that both Netscreen and Sourcefire will be in the next
edition and I have to say that Dragon was off our list of IDS for ANY
speed of network some time ago due to its constant omission from these
reports (you don't have to pay for the 100Mbit IDS reports, they are
all on-line for free in full).

EOF
}}}}}} Comments below:


Rick, et al.:

Hi there!

This thread has already brought up some interesting points over the past
two weeks, so I'll make an effort to avoid repeating what has been
stated already. We've all got enough email to catch up on from the
holidays without being redundant! I just wanted to address the comment
made above. 

The suggestion that someone wouldn't look at Dragon, or ANY IDS for
that matter, because it repeatedly does not participate in one specific
test is a little unnerving. Sure, there are industry testing standards,
such as OSEC, that are undoubtedly the most inclusive and open to peer
analysis, and I would be suspicious of anyone who refuses to
participate in those. (We can revisit this point later.) But to take a
product that continuously participates successfully in other magazine
and third-party tests, and write it off because it does not participate
in one (or another) specific test... Well, for me that would raise more
red flags about the test than the product. If Dragon (or **any** IDS)
refused all the tests publicly available, then of course we could draw
some conclusions like you have alluded to. Please realize that I am not
the official spokesperson for these types of subjects - just someone
who's close enough to have some thoughts on it...

There are several issues at work when we discuss testing. First of all
- and most generically - getting involved in a test is very time
consuming. You really don't think the people who are doing the testing
actually take a few months to learn the ins-and-outs of each product
before testing it, do you? Of course not! I've only seen it done once,
and it was by the same guys who wrote much of the OSEC testing
standard. (Incidentally, that test took almost a year to complete!)
Anyways, for every test we participate in, we need to send people from
our Dev, QA, or R&D teams to assist on-site with the testing cycle.
Those are resources that are being taken away from their normal
schedules to assist the testing team in question. Since there are MANY
tests conducted every year, we have to carefully choose which to
participate in, and which not to.

Time and resources are not the only factors involved in selecting which
tests to participate in (or not). Generally, we are given a description
of the testing methodology upfront. Believe it or not, though,
sometimes we're told that we cannot see the testing methodology
upfront. This dumbfounds me for all the reasons that MJR (and others)
have already brought up. It is too easy to inadvertently (and sometimes
intentionally - read: Miercom's test of Intrusion.com) skew the results
of any IDS test. When I think of testing, I think of scientific
process. Unfortunately, many of the IDS tests you read each year do not
adhere to any sort of process, much less an actual scientific
methodology. If a third-party testing group tells us that we are not
allowed to view the test process we'll be subjected to, then we will
probably reject the offer to test - because of all the terribly flawed
test plans we have already seen to date. In many tests we have seen,
the ENTIRE test plan reflects the tester's understanding of one
particular facet of IDS technologies/methodologies (or worse, lack
thereof). That is not a bad thing if the test is billed as covering
only that narrow area in the first place. Unfortunately, that is
frequently not the case.

This is one of the greatest promises of OSEC. It was written by a large
group of people who have backgrounds in network-hardware performance
testing, blackbox testing, pen testing, IDS development, IDS evasion
development, etc... Additionally, if there is something that you or I
don't like about it, there is an entire community there to hear ideas
for improvements, not just one or two people from a privately-held
organization. And that's one of the most significant differences to
begin with! Every aspect of the test is available to anyone who wishes
to see it, and if you don't like what you see, you can get on here (or
go to them directly) and speak your mind about the test, results, or
methods. They are helpful and take your input very seriously. At least,
that's the experience I've had.

Now, as far as the NSS test goes... This is not a free test. Not only
does each company have to pay for the tests, they have to pay
additionally for the reports generated by the tests. While the reasons
for this are understandable, this alone should raise some flags about
the agenda of companies that drive marketing campaigns on results from
tests like these. I'll stop here on this one.

Also - and more importantly - there have been issues with NSS testing
methodologies. Rather than give you my slanted (and VERY strong)
opinion on the subject, I'll suggest this: look at the tools they use
to implement their tests, then do a search on lists (like this one) to
see some of the pros/cons of using those tools. Put those individual
discussions together, and you'll get a clearer view of the bigger
picture here.

Anyways, I think the point has already been made in other emails,
but... Don't base your decisions exclusively on one test - it's too
easy to introduce significant methodology flaws into a test; ***DO
NOT*** base your decision solely on test results that are given to you
by a vendor; and if you have any doubts - test it yourself or ask for
other end-users' experiences on a list like this. There are things like
stability, support, and the ability to integrate effectively into your
environment that frequently cannot be discovered without your own
testing. Hopefully the vendors will respect your question enough to not
skew the conversation. (Right, Simon?!)

Anyways, just some thoughts...

-gary


Gary Golomb
Detection Research Engineer
IDS Group
Enterasys Networks
410-312-3194 

