funsec mailing list archives

RE: Consumer Reports Slammed for Creating 'Test' Viruses


From: Drsolly <drsollyp () drsolly com>
Date: Thu, 24 Aug 2006 13:35:00 +0100 (BST)

> We really don't have the data, do we? That is a problem, although how
> valid the testing is depends somewhat on how well the products did in
> detecting the variants. This whole conversation is somewhat hypothetical
> on that account. Assume they actually created 5500 viruses.

That might be your assumption.

Mine, based on past experience, is that they didn't, and I note that we 
don't have any evidence at all that they did.

I find it hard to believe that they wrote each of these by hand. They must 
have used a program to generate them all. So, the output of the program is 
5,500 files. How do you determine that each of those is a virus?

Really, the only way is to run each specimen and determine that it
infects another file. (Strictly, even that doesn't prove it's a virus,
because you'd need to show that the infected file replicates in turn; and
if it doesn't infect another file, it probably isn't a virus, although it
might be. This is getting complicated, so I'll stop explaining it at this
point, because we don't want the test on each file to take very long, for
reasons I'll explain next.)

How much time would it take to do this for 5,500 files? Suppose at least
ten minutes per file: 5,500 times ten minutes is roughly 917 hours, call
it 1,000 hours, of very boring and tedious work. At 8 hours per day,
every day, that's over four months, full time.

Can it be automated? Not easily. If you run a file that *is* a virus, then 
you've ruined the system you're testing on as a test system, and it'll need 
to be reinstalled (via an image process, perhaps). Setting up a system 
that will automate the "is it a virus" test isn't easy - I know, I did 
one. And we have no evidence that they did this. They didn't mention it in 
the article.
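As a rough illustration of what automating the "is it a virus" test involves (restore a clean image, run the sample, check whether a bait or "goat" file was modified), here is a toy sketch. The dict-based "disk" and the callable "samples" are stand-ins I've invented for illustration; a real harness needs actual disk imaging and a real execution sandbox:

```python
import hashlib

# Toy model of an automated "does it infect another file?" loop.
# A dict stands in for the disk; samples are plain Python callables.

CLEAN_GOAT = b"MZ goat program bytes"

def clean_image():
    """Restore the 'machine' to a known-good state (the re-image step)."""
    return {"goat.exe": CLEAN_GOAT}

def run_sample(disk, sample):
    """Execute one sample against the disk (the sandboxed run step)."""
    sample(disk)

def goat_infected(disk):
    """Did the goat file change? A crude replication check."""
    return hashlib.sha256(disk["goat.exe"]).digest() != \
           hashlib.sha256(CLEAN_GOAT).digest()

# Two toy 'samples': one appends itself to the goat, one does nothing.
infector = lambda disk: disk.update({"goat.exe": disk["goat.exe"] + b"X"})
dud      = lambda disk: None

results = {}
for name, sample in [("infector", infector), ("dud", dud)]:
    disk = clean_image()          # re-image before every single run
    run_sample(disk, sample)
    results[name] = goat_infected(disk)
```

Even in this toy form, the re-image-before-every-run step is the expensive part the post is pointing at.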

And that's why my assumption, unless I see some evidence otherwise, is 
that they didn't test 5,500 files to determine if they were viruses.

And so, we really have no idea what they used for this test, but more 
importantly, there's strong reason to believe that *they* didn't know what 
they were using.

 
 
> -One- person suggested retrospective testing?? It's hardly a new
> suggestion. There have already been several competent tests using that
> method. Rob and I wrote about it in VR in 2000, and I doubt if we
> invented it.
> I've already admitted to being warmer to it now than I was then. ;-)

Obviously it wasn't new to me either, since I then immediately wrote
about how I had done it myself in the past.

The BIG BIG advantage of retrospective testing is that you can be 
certain that the files in the test suite really are viruses, because each 
of them has been verified as a replicating virus.

If I were the victim of this test, and if I wanted to show that the test 
was nonsense, I'd ask for access to these 5,500 files, and I'd analyse 
a small sample of them to see what fraction of the sample really were 
viruses. My expectation would be that this wouldn't be 100%, and this 
would invalidate the test. Just how invalid would require someone to 
analyse all 5,500, of course.
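The sampling idea in that last paragraph can be sketched as follows. Here `verified_as_virus` and the 80% figure are purely illustrative stand-ins for the slow manual analysis, not claims about the real file set:

```python
import random

random.seed(1)

# Stand-in for the slow manual check; in reality each call is ~10
# minutes of analysis. Here, pretend 80% of the suite replicates
# (an arbitrary illustrative figure, not a claim about the real test).
def verified_as_virus(sample_id: int) -> bool:
    return sample_id % 5 != 0    # 4 out of 5 "replicate"

suite = list(range(5500))
sample = random.sample(suite, 100)   # analyse only 100 of the 5,500
confirmed = sum(verified_as_virus(s) for s in sample)
fraction = confirmed / len(sample)
```

If the fraction in the sample falls clearly short of 100%, the test is invalidated without anyone having to analyse all 5,500 files.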

_______________________________________________
Fun and Misc security discussion for OT posts.
https://linuxbox.org/cgi-bin/mailman/listinfo/funsec
Note: funsec is a public and open mailing list.
