IDS mailing list archives

Re: Intrusion Prevention requirements document


From: Bob Walder <bwalder () spamcop net>
Date: Thu, 10 Nov 2005 07:58:52 +0100

Some excellent points Matt - the key to using replay tools effectively is,
of course, to produce your own PCAPs and variants. That way you make sure
that the ones you want to use are recent enough to test an IPS/IDS properly,
and that you are testing for ability to detect and block an attempt to
exploit a vulnerability, and not just a particular piece of exploit code.

You can always use tools like Metasploit to create your own PCAPs, and then
use those PCAPs in a replay tool to save time - the two types of tool are
not mutually exclusive. Both Metasploit and Canvas are excellent.
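To sketch what "produce your own PCAPs" can look like: the libpcap file format is simple enough to emit with nothing but the Python standard library. This is a rough illustration, not production code - the file name and frame contents are made up; in practice you'd capture the bytes of a live Metasploit run:

```python
import struct

def write_pcap(path, packets, linktype=1):
    """Write raw packet byte-strings to a libpcap-format file.
    linktype=1 is LINKTYPE_ETHERNET."""
    with open(path, "wb") as f:
        # pcap global header: magic, version 2.4, thiszone, sigfigs,
        # snaplen, network (link-layer type)
        f.write(struct.pack("<IHHiIII",
                            0xA1B2C3D4, 2, 4, 0, 0, 65535, linktype))
        for ts_sec, ts_usec, pkt in packets:
            # per-packet record header: ts_sec, ts_usec, incl_len, orig_len
            f.write(struct.pack("<IIII", ts_sec, ts_usec, len(pkt), len(pkt)))
            f.write(pkt)

# Example: one dummy Ethernet frame (14-byte header plus payload)
frame = bytes(14) + b"payload-goes-here"
write_pcap("test.pcap", [(0, 0, frame)])
```

The resulting file can then be fed straight to tcpreplay or a similar replay tool.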

Evasion testing is often better done by hand - replay tools tend to be
unpredictable when doing lots of fragmentation, etc.

Volume and load are the areas that are most difficult to test without
specialised equipment.

It is worth remembering that those PCAPs supplied with replay applications
are great to use as "baseline" tests, because - being older and "out there"
- you can be sure that every vendor has a signature for them. Also, they
provide a good way to do a quick sanity check to see if your IDS/IPS is
detecting and alerting after installation or policy change.

Bob Walder
The NSS Group


On 8/11/05 12:07, "FinAckSyn" <finacksyn () yahoo co uk> wrote:

Hi VT,

I strongly believe that replay tools are NOT an
effective way to test an IPS:

1)  Replay tools are an unfair way to compare vendors.
 There is no measure as to the speed at which an IPS
vendor responded to the original vulnerability.  These
.pcap files have been around for months or even years,
giving plenty of time for vendors to catch up and
write signatures to stop them.  It's all very well
having an IPS that responds favorably to a replay
tool, but if certain signatures took days, weeks or
even months to write, then this is not a fair way to
compare device A with device B.
These stats can be difficult to get hold of, but try to get
'time to respond' figures from your proposed IPS
vendor.

2)  Replay tools do not test variants.  Although the
pcap content may reflect a specific exploit, what
about all the other exploits that abuse the same
vulnerability?  For example, the 50-odd variants of
Blaster, Slammer or Code Red?  These test tools
usually include pcaps for one particular variant only.
 A good freeware tool to test variations is
Metasploit, or if you have spare cash, Canvas is a
worthwhile investment.
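As a toy illustration of why variants matter (the endpoint path here is invented), the same "vulnerable" resource can be requested in several differently-encoded ways. A signature written against the vulnerability should flag all of them; a signature written against one exploit's bytes may only flag the first:

```python
import urllib.parse

BASE_PATH = "/scripts/vulnerable.dll"   # hypothetical vulnerable endpoint

def percent_encode(path, chars):
    # Percent-encode selected characters - a classic signature-dodging trick
    return "".join("%%%02X" % ord(c) if c in chars else c for c in path)

def variants(path):
    # All four request lines reach the same resource on a tolerant server
    yield "GET %s HTTP/1.0\r\n\r\n" % path
    yield "GET  %s  HTTP/1.0\r\n\r\n" % path            # extra whitespace
    yield "GET %s HTTP/1.0\r\n\r\n" % percent_encode(path, "/.")
    yield "gEt %s HTTP/1.0\r\n\r\n" % path              # mixed-case method

reqs = list(variants(BASE_PATH))
```

Metasploit automates this kind of mutation properly; the sketch above just shows the principle.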

3)  Replay tools do not test the ability of a device to
withstand volume.  When worm/virus outbreaks happen,
you don't just get a single packet in .pcap fashion
that comes in, trips a signature, and gets blocked.
You usually get several million of them.  This is
where it is important to test any rate-based features
of the device.  Also make sure these rate-based
features don't block valid traffic.  Plenty of
freeware tools are available - Apache Benchmark, nmap,
Nessus, hping2 and a SYN flood tool called Juno.
What's more, these tools do not rely on .pcaps, so are
a lot more real world.  Be warned that replaying an
identical pcap several million times is not an
accurate way to rate-test, since all L2/L3
information is identical in each packet (e.g. no change
in SEQ).  Devices that are good at DDoS tend to be
just as good at withstanding sudden, large propagation
of worms.  
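To illustrate the SEQ point, here is a rough sketch of generating a flood where every packet is distinct - each TCP SYN header is built by hand with struct, with a random source port and sequence number. Checksums and actual raw-socket delivery are deliberately out of scope:

```python
import random
import struct

def tcp_syn_header(sport, dport, seq):
    # 20-byte TCP header with only the SYN flag set.
    # Checksum is left at zero; a real sender fills it in.
    offset_flags = (5 << 12) | 0x02      # data offset = 5 words, SYN bit
    return struct.pack("!HHIIHHHH",
                       sport, dport,     # source / destination port
                       seq, 0,           # sequence / ack numbers
                       offset_flags,
                       65535,            # window size
                       0, 0)             # checksum, urgent pointer

# Unlike replaying one pcap a million times, every packet here differs
packets = [tcp_syn_header(random.randint(1024, 65535), 80,
                          random.getrandbits(32))
           for _ in range(1000)]
```

Tools like hping2 do this (and the sending) for you; the point is simply that a rate test needs varying headers, not one frozen packet.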

4)  Replay tools do not test IPS avoidance.  Well, you
may get vendor-supplied pcaps like
blaster_with_fragments.pcap, but there are so many
other ways to evade an IPS, and can you be sure that
the vendor supplied .pcap is a reflection of real
world traffic?  Get hold of fragroute and tcpsic;
Nessus also has some good evasion options.
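A minimal sketch of the fragmentation idea (the request line is just an old directory-traversal example, chosen for illustration): split the attack into tiny fragments, deliver them out of order, and check whether the device still reassembles what the target stack will see:

```python
def fragment(payload, size):
    # Split a payload into (offset, chunk) pieces, fragroute-style
    return [(i, payload[i:i + size]) for i in range(0, len(payload), size)]

def reassemble(frags):
    # What a correct target stack (and a good IPS) should reconstruct
    buf = bytearray(max(off + len(c) for off, c in frags))
    for off, chunk in sorted(frags):
        buf[off:off + len(chunk)] = chunk
    return bytes(buf)

attack = b"GET /scripts/..%c0%af../cmd.exe HTTP/1.0\r\n\r\n"
frags = fragment(attack, 4)   # tiny fragments defeat naive pattern matching
frags.reverse()               # deliver them out of order, too
assert reassemble(frags) == attack
```

An IPS that only pattern-matches individual packets will never see the attack string whole, which is exactly what fragroute exploits.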

5) Replay tools do not test zero-day protection.  This
is the fun bit.  How do you test that a network IPS
protects you against the shape of things to come?
If you have a research team that can generate
signatures to protect you quickly, then great.  But
there's still a window of opportunity during which
your security can be breached.
If an IPS is firing back events such as Sasser.W32/A
blocked, or Code.Red.B.W32/Z, then chances are, it's
not an anomaly based system that will give you much in
the way of zero-day protection.  But if it's firing
events like 'CIFS Field Too Long' or 'HTTP Header
Contains Illegal Characters', then this is indicative
of the machine having good anomaly/zero-day protection,
rather than specific signatures for specific
pre-historic viral events...  ;)
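As a toy model of that difference (the length limit and alert names below are invented, not taken from any real product), an anomaly engine checks protocol constraints rather than known exploit bytes - so an overlong header trips an alert even when no signature exists for the exploit carrying it:

```python
MAX_HEADER_LEN = 256   # illustrative limit only

def http_header_anomalies(raw):
    # Flag protocol anomalies instead of matching known exploit bytes
    alerts = []
    for line in raw.split("\r\n"):
        if len(line) > MAX_HEADER_LEN:
            alerts.append("HTTP Header Too Long")
        if any(ord(ch) < 0x20 and ch not in "\t" for ch in line):
            alerts.append("HTTP Header Contains Illegal Characters")
    return alerts

# A long header trips the length rule with no exploit-specific signature
probe = "Host: " + "A" * 1000
assert http_header_anomalies(probe) == ["HTTP Header Too Long"]
```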

IPS testing is a big task, and needs big tools - to do
things properly, take a look at the NSS testing suite,
for example -  

http://www.nss.co.uk/utm/appendix_a/appendix_a.htm

In conclusion (I hear sighs of relief...):

* Replay tools cannot be solely relied on to test an
effective IPS 
* Think about what you want your network IPS to do -
content-based checks are important, but equally
important are access control and rate-based capability.
* A network IPS will never provide 100% perimeter
protection.  Always invest in extra security layers,
especially host-based, to ensure that anything the IPS
lets through does not cause problems.
* If you buy an IPS on the merits of tcpreplay
results, you risk being hit with a zero-day threat or
DoS condition, and losing your job.
* Treat any vendor that promotes testing with a replay
tool with caution (I learnt the hard way...)

Hope this helps,

Regards,

Matt





