Well, there's a right way and a wrong way. Unfortunately, ConsumerReports.org knew neither ESET's NOD32 nor the right way to test for unknown viruses.
Here is what happened.
"To pit the software against novel threats not identified on signature lists, we created 5,500 new virus variants derived from six categories of known viruses, the kind you'd most likely encounter in real life." -ConsumerReports.org
How do they know that these are the kinds of viruses you'd most likely encounter in real life? Do they plan to release them? I doubt it, so how can they know? They can't, really.
"Our next round of tests helped us identify superior antivirus software by measuring how well each of the products identified new viruses even before their signatures had been downloaded. The antivirus programs did this by using a technique known as heuristics, in which they seek out behaviors rather than signatures." -ConsumerReports.org
There are two problems with their approach. First, writing viruses for this purpose risks the accidental release of new viruses, and second, it does not produce real-world test results.
The proper way to test the heuristics of anti-virus software is called retrospective testing. You take a collection of viruses discovered during a specific period of time and scan them with scanners that were last updated before that period began. For example: I could stop updating a scanner on April 30, then collect samples found between May 1 and July 31. When I then scan those samples, I am measuring how many of the *real* threats that people are *really* encountering get detected without the benefit of current signatures.
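To make the timeline concrete, here is a minimal sketch of that setup in Python. The dates come straight from the example above; the sample metadata and the scan() stub are purely hypothetical placeholders for a real sample collection and a real scanner whose updates were frozen on the cut-off date.

from datetime import date

FREEZE_DATE = date(2006, 4, 30)   # scanner receives no signature updates after this day
WINDOW_START = date(2006, 5, 1)   # samples first seen after the freeze...
WINDOW_END = date(2006, 7, 31)    # ...through the end of the collection window

# Hypothetical sample set: each entry records when the threat was first seen in the wild.
samples = [
    {"path": "sample_001.bin", "first_seen": date(2006, 5, 14)},
    {"path": "sample_002.bin", "first_seen": date(2006, 6, 2)},
    {"path": "sample_003.bin", "first_seen": date(2006, 8, 9)},  # outside the window, excluded
]

def scan(path: str) -> bool:
    """Placeholder: run the frozen scanner against the file and report detection."""
    return False  # stub result so the sketch runs end to end

# Only samples discovered inside the window count. Anything the frozen scanner
# detects here must have been caught by heuristics, not by a fresh signature.
in_window = [s for s in samples if WINDOW_START <= s["first_seen"] <= WINDOW_END]
detected = sum(1 for s in in_window if scan(s["path"]))
print(f"Heuristic detection rate: {detected}/{len(in_window)}")

The point of the separation is that no virus ever has to be written: the test set consists entirely of threats that really circulated after the signature freeze.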
Back in 2001, Igor Muttik from McAfee presented a paper on retrospective testing at the Virus Bulletin Conference.
This paper inspired Andreas Clementi (www.av-comparatives.org) to adopt exactly that type of testing. Andreas Marx (www.av-test.org) also performs retrospective testing based on the same principles. There is a good reason these highly respected test organizations test heuristics in this manner: they both know that writing viruses is not proper behavior and that retrospective testing provides scientific, real-world results.
It is a pity that ConsumerReports.org didn't know about the safety and scientific advantages of retrospective testing; it would have lent real-world credibility to their results.
Randy Abrams
Director of Technical Education