COMPARATIVE REVIEW
The 1995 Scanner Top Ten
Looking at the introduction to VB's July 1994 review of DOS-based virus
scanners, one can see that the last six months have been 'business as
usual' for manufacturers.
Since then, some names have disappeared, and new ones have been added.
The number of viruses has continued to climb, and polymorphic virus
detection has been made more urgent by the appearance in the wild of
two highly polymorphic viruses: Smeg.Pathogen and Smeg.Queeg. Although
neither is widespread, they highlight the need for vendors to continue
to improve - the battle with the virus authors is no nearer to completion
than it was then.
This review expands on the theme of polymorphic virus detection, and has
a distinctly 'in the wild' flavour. The Boot Sector, Polymorphic and In
the Wild test-sets have all been extensively revamped, making these the
toughest benchmark tests which products have ever had to undergo.
Testing Protocol
----------------
Products were put through their paces against four test-sets: 'In the
Wild', with 126 samples of file infectors known to be causing a problem
in the 'real world'; a Boot Sector collection, consisting of 11 boot
sector viruses frequently encountered in the wild; the 'Standard'
test-set, consisting of 230 file infectors; and finally, the
'Polymorphic' test-set, which contains a mammoth 4796 infected files.
For full details of the test-sets, see the table at the end of the
article.
Disk scanning speed on an uninfected Bernoulli 90, an uninfected
diskette and an infected diskette was measured for each product. All
tests were performed on a Compaq Deskpro 386/20e, with a 112 MByte hard
disk and 4 Mbytes of memory.
Product speeds are given in kilobytes per second, derived from the
time taken to scan an uninfected Bernoulli 90 containing 1430
executable files spread across 40 directories and occupying 76,049,880
bytes. The test diskettes used were both 1.44 Mbyte 3.5-inch floppy
disks. The clean diskette contained 18 EXE, 19 COM and 12 SYS files,
occupying 1,433,709 bytes. The infected diskette contained 100 EXE and
58 COM files, all infected with either Groove or Coffeeshop, and
occupying 1,413,319 bytes.
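For reference, the throughput figures can be reproduced from the scan
times in the speed table, on the assumption that the review treats one
kilobyte as 1000 bytes (an inference: this assumption matches the
published figures, whereas 1024-byte kilobytes do not). A minimal
sketch:

```python
def scan_speed_kb_per_sec(total_bytes, minutes, seconds):
    """Scanner throughput in KB/sec, assuming 1 KB = 1000 bytes
    (this assumption reproduces the figures in the speed table)."""
    return round(total_bytes / 1000 / (minutes * 60 + seconds), 1)

# Bernoulli 90 test volume: 76,049,880 bytes
print(scan_speed_kb_per_sec(76_049_880, 2, 6))    # ThunderBYTE at 2:06 -> 603.6
print(scan_speed_kb_per_sec(76_049_880, 19, 35))  # Dr Solomon's AVTK at 19:35 -> 64.7
```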
All tests were carried out using each product's default settings, except
to turn off any audible alarms when viruses were discovered, or to
allow the product to proceed without user interaction during detection
tests. Overall positioning of products is made by considering only the
total number of viruses detected. The reader is referred to the
appropriate edition of Virus Bulletin for full details of any
product.
AntiVirus+ v4.20.29
In the Wild 90.5%
Boot Sectors 63.6%
Standard 97.4%
Polymorphic 12.6%
An uninspiring set of results from Iris Software, especially against
the boot sector and polymorphic test-sets.
Anyware AntiVirus v2.15
In the Wild 82.5%
Boot Sectors 72.7%
Standard 98.3%
Polymorphic 10.4%
A new contender in the Virus Bulletin comparative review, this
Spanish product ships in a box covered with awards from various
magazines. Unfortunately, the scanner contained within it does not
seem to be able to deliver the goods, especially when pitted against
the very tough Polymorphic test-set.
Avast! v7.00
In the Wild 99.2%
Boot Sectors 81.8%
Standard 100.0%
Polymorphic 100.0%
An extremely impressive set of results from Avast! puts the product
well in the lead in terms of polymorphic virus detection. However,
these excellent detection results mask a compatibility problem with
the test machine: during scanning of the infected floppy diskettes,
the machine would hang when the product was used in 'multiple
floppy' mode.
AVScan v1.83
In the Wild 100.0%
Boot Sectors 90.9%
Standard 100.0%
Polymorphic 98.2%
H+BEDV's AVScan has always obtained good detection results, and this
set is no exception, earning it a reputation as a reliable and
easy-to-use scanner which gets the job done. The impending release of
an English language version of the company's commercial scanner may
well provide another good choice for corporate use.
Central Point Anti-Virus v2.0
In the Wild 86.5%
Boot Sectors 72.7%
Standard 96.5%
Polymorphic Failed to complete
It is almost becoming a feature of Virus Bulletin comparative reviews
that CPAV is unable to complete the full set of tests. Although the
problem of the product crashing after detecting 256 viruses has now
been fixed, there are certain files which cause the machine either to
pause for an unacceptably long time or to crash during scanning. The
problem was repeatable, and needs to be solved.
Dr Solomon's AVTK v7.03
In the Wild 100.0%
Boot Sectors 100.0%
Standard 100.0%
Polymorphic 99.5%
An excellent set of results from this well-respected product, missing
only some samples of Cruncher and Smeg_v0.3. Scanning speeds on an
infected machine are very slow, but clean disks are scanned at a
respectable pace.
F-Prot Professional v2.14a
In the Wild 96.8%
Boot Sectors 100.0%
Standard 99.6%
Polymorphic 94.6%
Like so many other products, the more esoteric polymorphic viruses
caused some problems for F-Prot. Despite this, the product still
performed well overall, remaining close to the top of the field.
IBM Anti-Virus with PC-DOS v6.3
In the Wild 84.1%
Boot Sectors 63.6%
Standard 97.8%
Polymorphic 10.4%
As with MSAV, this product's poor performance is chiefly due to its
age: the files on the disk were dated 25/01/94. Like several other
products, IBMAV was unable to scan the Quox-infected diskette.
Results for the commercial release of IBM Anti-Virus are given below.
IBM Anti-Virus v1.07
In the Wild 94.4%
Boot Sectors 100.0%
Standard 99.6%
Polymorphic 55.1%
Considerably improved scores from the version shipped with PC-DOS v6.3,
although polymorphic virus detection is still lacking.
InocuLAN v3.0
In the Wild 89.7%
Boot Sectors 63.6%
Standard 97.4%
Polymorphic 12.6%
The workstation component of InocuLAN uses the same scanning engine
as Iris AntiVirus+, and obtained exactly the same detection results.
Although the product could detect the presence of the Mutation
Engine, very few of the more recent and complex polymorphics were
picked up by the scanner, an important consideration in a NetWare
product.
McAfee Scan v2.1.1
In the Wild 90.5%
Boot Sectors 100.0%
Standard 96.1%
Polymorphic 57.4%
The new version of McAfee Scan looks more like a pre-release copy of
the software than a finished product. After installation, the
product refused to run, producing 'Error code 2' in 'Source ph_futil.c,
Location: 1713, Status -1, Information $Revision: 1.15$ Data file not
found.'
This was eventually tracked down to a problem with missing data files
which the installation routine had inadvertently deleted from the disk.
Surely an error message saying that simply and clearly would have been
more appropriate?
The detection results are rather confusing, as the product would under
certain circumstances detect a virus during the scan, but not include
the virus in its summary of the scan results. This is an extremely
serious bug, and one can only hope McAfee Associates has since fixed
it. Version 2.1 of Scan may well be faster, but has a long way to go
if it is ever to reach the top.
Microsoft Anti-Virus
In the Wild 61.1%
Boot Sectors 18.2%
Standard 91.3%
Polymorphic 9.7%
Although some of the viruses missed by MSAV were due to the age of
the product (the most recent file was dated 31/05/94), the poor
detection results are still inexcusable. Despite the fact that the
scanner is not the oldest in the test, MSAV has the dubious honour of
being placed last.
Norman NVC v3.43
In the Wild 96.0%
Boot Sectors 90.9%
Standard 98.7%
Polymorphic 33.1%
A disappointing score on the Polymorphic test-set mars what is
otherwise a reasonable set of results for Norman Data Defence
System's NVC. The only missed boot sector virus was Quox, where the
product refused to scan the diskette in the drive, reporting 'No
Floppy found in A:'.
Norton AntiVirus v3.0
In the Wild 86.5%
Boot Sectors 54.5%
Standard 97.8%
Polymorphic 33.9%
The version of NAV sent in for review was quite old (dated 03/02/94),
which goes some way toward explaining the product's poor detection
results. Note, however, that Virus Bulletin has a strict policy of
reviewing what it is sent: one hopes that buyers receive a more
up-to-date copy.
Novell DOS 7
In the Wild 84.1%
Boot Sectors 63.6%
Standard 96.5%
Polymorphic 11.0%
Like the other anti-virus products included with the operating
system, NDOS7's scanner is fearfully out of date, and this is
reflected in its detection results. If one is determined to use the
protection provided with DOS, this result shows that it is vital to
ensure it has been recently updated.
PCVP v2.06
In the Wild 92.9%
Boot Sectors 63.6%
Standard 95.7%
Polymorphic 60.6%
Like many other scanners, PCVP was unable to scan the boot sector of
the Quox-infected diskette. Improvements are needed in both the
Polymorphic and Boot Sector test-sets.
Scan Vakzin v4.167
In the Wild 75.4%
Boot Sectors 72.7%
Standard 94.3%
Polymorphic 11.7%
Scan Vakzin is the first ever Japanese scanner to be entered into
a Virus Bulletin comparative review. Boot sector virus detection was
acceptable, although, like several other products, the scanner refused
to recognise the Quox-infected diskette. More seriously, no errors
were reported by the scanner, meaning that a user scanning several
floppy disks might not be aware of the problem. Detection of
polymorphic viruses was also lacking.
Sophos' Sweep v2.66
In the Wild 99.2%
Boot Sectors 81.8%
Standard 100.0%
Polymorphic 78.0%
A solid set of detection results by Sophos' Sweep, although the
product did seem to have problems with some of the newer viruses in
the polymorphic test-set, and was still unable to detect the sample of
the Quox virus, which is known to be in the wild. Still a good
product, but with room for improvement.
ThunderBYTE v6.26
In the Wild 99.2%
Boot Sectors 100.0%
Standard 100.0%
Polymorphic 98.1%
Another set of cracking detection scores from this speedy product - a
combination of good detection and incredible scanning speed puts
ThunderBYTE very close to the top of the pack.
VET v7.825
In the Wild 95.2%
Boot Sectors 100.0% - but see text
Standard 98.3%
Polymorphic 98.6%
A very good set of test results from this antipodean product. The
only point to note is that two of the boot sector viruses (Natas
and Peanut) were not detected as such, but caused VET to display
the message 'VET does not recognise the boot sector on the disk. It
is probably harmless, but it COULD contain a virus'. This provides
a useful early-warning system for new viruses, but relies on the
developers of VET maintaining a large collection of valid boot
sectors.
Virus Alert v3.24
In the Wild 100.0%
Boot Sectors 81.8%
Standard 100.0%
Polymorphic 74.0%
Virus Alert, produced by Look Software in Canada, sports a bright
cheerful user interface (right down to wishing the user 'Joy Peace
and Happiness' on the closedown screen). In terms of both speed
and detection, it performed extremely well, although polymorphic
virus detection could still be improved. Well worth a second look.
Virus Buster v4.04.01
In the Wild 84.1%
Boot Sectors 90.9%
Standard 92.6%
Polymorphic 14.6%
Leprechaun's Virus Buster does not have a clear default mode of
operation. However, timings and detection results are given for the
fast scan mode. If the secure scan is selected, only the detection
for the standard test-set changes, rising to 93.5%. Polymorphic
virus detection was poor.
ViruSafe v6.3
In the Wild 81.0%
Boot Sectors 81.8%
Standard 96.5%
Polymorphic 31.8%
A middle-of-the-road set of detection figures from EliaShim, which
leaves plenty of room for improvement.
Vi-SPY v12 rel 11.94
In the Wild 98.4%
Boot Sectors 81.8%
Standard 100.0%
Polymorphic 76.5%
Vi-SPY, by RG Software, is another product which has a history of
scoring well in VB reviews. This year's results are no exception,
although the product missed two viruses in the wild (Peanut and
Phantom1) and two boot sector viruses (Peanut and Quox). The
product's inability to detect the Quox virus stems from its refusal
to scan the Quox-infected disk, baldly stating that it had
encountered a 'general failure reading drive a:'.
Closing Thoughts
----------------
There is no question that this is the toughest Virus Bulletin
comparative review ever carried out. The viruses used in the
polymorphic test-set have been chosen carefully to include several
new samples, in order to highlight those vendors which are keeping
their products completely up to date. Similarly, the boot sector
virus collection has been revamped in order to reflect the changing
threat. These enhancements have stretched the field, leaving one
clear winner in terms of detection.
Every product should detect 100% of the In the Wild test-set: no
excuse should be accepted from any vendor who did not score highly
in this test. In practice, however, only three products achieved
perfect scores in this test.
Speed results are given for a number of different system setups.
Note that all products were run wherever possible in their default
modes, although for testing purposes certain options sometimes had
to be selected.
Perhaps the most critical timing test was that on an uninfected
diskette: this represents one of the most common uses of the scanner.
Scan time under heavily-infected conditions is less critical, though
the reader should bear in mind that, during a large virus outbreak,
there are likely to be a number of machines with infected executables
which will need to be cleaned. Too long a scan time on such a machine
could make the product unusable when it is most needed.
Several products have performed so well that they deserve an
individual mention. The top product in terms of virus detection
was Dr Solomon's AntiVirus ToolKit, followed closely by ThunderBYTE,
the second most accurate scanner, and by far the fastest. Other
products which are worthy of praise are Cybec's VET, and Alwil
Software's Avast!, which gained an extremely impressive 100% against
the tricky polymorphic test-set.
If your product is not one of these, and is not at least scoring
highly in the Boot Sector and In the Wild test-sets, you should ask
your vendor to explain why.
                      In the Wild  Boot Sector  Standard  Polymorphic  Overall
                         (126)        (11)        (230)      (4796)     (100)
AntiVirus+...........     114          7           224         602       66.0
Anyware AntiVirus....     104          8           226         500       66.0
Avast!...............     125          9           230        4796       95.3
AVScan...............     126         10           230        4712       97.3
CPAV.................     109          8           222       Failed      63.9
Dr Solomon's AVTK....     126         11           230        4773       99.9
F-Prot Professional..     122         11           229        4535       97.8
IBMAV with PC-DOS....     106          7           225         500       64.0
IBMAV v1.07..........     119         11           229        2643       87.3
InocuLAN.............     113          7           224         602       65.8
McAfee Scan..........     114         11           221        2751       86.0
MSAV.................      77          2           210         466       45.1
Norman Virus Control.     121         10           227        1586       79.1
Norton AntiVirus.....     109          6           225        1624       68.2
Novell DOS 7.........     106          7           222         526       63.8
PCVP.................     117          7           220        2908       78.2
Scan Vakzin..........      95          8           217         561       63.5
Sweep................     125          9           230        3743       89.8
ThunderBYTE..........     125         11           230        4704       99.3
VET..................     120         11           226        4728       98.0
Virus Alert..........     126          9           230        3550       89.0
Virus Buster.........     106         10           213         700       70.6
ViruSafe.............     102          9           222        1525       72.8
Vi-SPY...............     124          9           230        3668       89.2
Detection Results: This year's review has left room for every developer
to improve: no-one escaped unscathed! Note that many of the viruses in
the Polymorphic test-set are quite new. This therefore exposes those
vendors who either update their products erratically, or take a long
time to add new virus signatures. Another area where more products
failed this year was the Boot Sector collection: only seven vendors
scored 100%.
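The Overall column in the table above appears to be the unweighted mean
of a product's four per-test-set detection percentages; this is an
inference from the published numbers, not a formula stated in the
review. A sketch under that assumption:

```python
def overall_score(results):
    """Unweighted mean of per-test-set detection percentages (inferred
    formula). `results` is a list of (viruses found, test-set size)."""
    pcts = [100.0 * found / total for found, total in results]
    return round(sum(pcts) / len(pcts), 1)

# Dr Solomon's AVTK: In the Wild, Boot Sector, Standard, Polymorphic
print(overall_score([(126, 126), (11, 11), (230, 230), (4773, 4796)]))  # 99.9
# MSAV, placed last in the review
print(overall_score([(77, 126), (2, 11), (210, 230), (466, 4796)]))     # 45.1
```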
                        File      Bernoulli   Bernoulli   Clean disk   Infected
                        dates      (KB/sec)     (m:s)       (m:s)     disk (m:s)
AntiVirus+...........  19/10/94      20.7       61:09        0:52        3:33
Anyware AntiVirus....  15/07/94      28.2       44:55        1:37        2:55
Avast!...............  14/10/94      15.1       83:42        1:17        3:00
AVScan...............  20/10/94      29.5       43:00        1:31        2:07
CPAV.................  23/09/94     Failed      Failed       2:15        3:50
Dr Solomon's AVTK....  20/09/94      64.7       19:35        0:57       16:35
F-Prot Professional..  30/09/94      66.0       19:13        1:13        2:45
IBMAV with PC-DOS....  25/01/94     Failed      Failed       1:55        3:52
IBMAV v1.07..........  05/08/94      15.4       82:15        1:27        3:17
InocuLAN.............  11/08/94      58.7       21:35        0:55        4:57
McAfee Scan..........  17/10/94      36.6       34:39        0:48        4:35
MSAV.................  31/05/94      21.8       58:05        1:14        6:22
Norman Virus Control.  10/09/94      59.0       21:28        1:22        2:15
Norton AntiVirus.....  03/02/94      76.1       16:40        0:40        2:27
Novell DOS 7.........  26/01/94      36.6       34:40        1:27        7:54
PCVP.................  20/10/94     353.7        3:35        0:37        0:54
Scan Vakzin..........  11/10/94      74.4       17:02        1:11        3:06
Sweep................  03/10/94      36.3       34:53        1:15        1:43
ThunderBYTE..........  24/10/94     603.6        2:06        0:23        2:18
VET..................  20/09/94      97.5       13:00        0:55        1:36
Virus Alert..........  27/09/94     585.0        2:10        0:24        2:21
Virus Buster.........  07/10/94      38.2       33:12        0:27        9:00
ViruSafe.............  18/10/94      65.3       19:25        0:38        1:35
Vi-SPY...............  20/10/94      32.7       38:47        1:05        6:25
Speed Results: Although speed is not everything, a scanner which is
as unobtrusive as possible is certainly an advantage. Once again,
ThunderBYTE from ESaSS streaked ahead of the rest of the field,
clocking up a monumental scan speed of 603.6 KBytes per second,
without compromising its detection rates.
TEST-SETS:
Boot Sector Test-set: One genuine infection on HD 1.44M 3.5-inch
diskette of:
Natas, Junkie, NoInt, Peanut, BFD-451, AntiEXE.A, Parity_Boot,
Empire.Monkey, Form, Quox, and LZR.
Polymorphic Test-set: 4796 infections of:
Cruncher (25), Girafe (1024), Groove and Coffee_Shop (500),
One_Half (1024), Pathogen (1024), Satan_Bug (100), SMEG_v0.3 (1024),
Uruguay.4 (75)
In the Wild Test-set: 126 genuine infections of:
4K (Frodo.Frodo.A) (2), Argyle, Athens (2), Barrotes.1310.A (2),
BFD-451, Black_Monday (2), Butterfly, Captain_Trips (2), Cascade.1701,
Cascade.1704, Chill, CMOS1-T1, CMOS1-T2, Coffeeshop (2),
Dark_Avenger.1800.A (2), Dark_Avenger.2100.DI.A (2),
Dark_Avenger.Father (2), Datalock.920.A (2), Dir-II.A, DOSHunter,
Eddie-2.A (2), Fax_Free.Topo, Fichv.2.1, Flip.2153.E (2),
Green_Caterpillar.1575.A (2), Halloechen.A (2), Helloween.1376 (2),
Hidenowt, HLLC.Even_Beeper.A, Jerusalem.1808.Standard (2),
Jerusalem.Anticad (2), Jerusalem.PcVrsDs (2),
Jerusalem.Zerotime.Australian.A (2), Junkie, KAOS4 (2),
Keypress.1232.A (2), Lamer's_Suprise, Liberty.2857.D (2), Loren (2),
Macgyver.2803.B, Maltese_Amoeba (2), Natas, Necros (2), No_Frills.843 (2),
No_Frills.Dudley (2), Nomenklatura (2), Nothing, Nov_17th.855.A (2),
Npox.963.A (2), Old_Yankee.1, Old_Yankee.2, Peanut, Phantom1 (2),
Pitch, Piter.A, Power_Pump.1, Revenge, Screaming_Fist.II.696 (2),
Satan_Bug (2), SBC, Sibel_Sheep (2), Spanish_Telecom (2), Spanz,
Starship (2), SVC.3103.A (2), Syslock.Macho (2), Syslock.Syslock.A,
Tequila, Todor (2), Tremor (5), Vacsina.Penza.700 (2), Vacsina.TP.5.A,
Vienna.627.A, Vienna.648.A, Vienna.W-13.534.A, Vienna.W-13.507.B,
Virdem.1336.English, Warrier, Warrior, Whale, XPEH.4928 (2).
Standard Test-set: 230 genuine infections of:
1049, 1260, 12_Tricks, 1575, 1600, 2100 (2), 2144 (2), 405, 417, 492,
4K (2), 5120, 516, 600, 696, 707, 777, 800, 8888, 8_Tunes, 905, 948,
AIDS, AIDS II, Alabama, Ambulance, Amoeba (2), Amstrad (2), Anthrax (2),
AntiCAD (2), Anti-Pascal (5), Armagedon, Attention, Bebe, Blood,
Burger (3), Butterfly, Captain_Trips (2), Cascade (2), Casper,
Coffee_Shop, Dark_Avenger, Darth_Vader (3), Datalock (2), Datacrime,
Datacrime_II (2), December 24th, Destructor, Diamond (2), Dir,
Diskjeb, DOSHunter, Dot_Killer, Durban, Eddie, Eddie 2, Fellowship,
Fish_1100, Fish_6 (2), Flash, Flip (2), Fu_Manchu (2), Halley,
Hallochen, Helloween (2), Hide_Nowt, Hymn (2), Icelandic (3),
Internal, Invisible_Man (2), Itavir, Jerusalem (2), Jocker, Jo-Jo,
July_13th, Kamikaze, Kemerovo, Kennedy, Keypress (2), Lehigh, Liberty (5),
LoveChild, Lozinsky, Macho (2), Maltese_Amoeba, MIX1 (2), MLTI,
Monxla, Murphy (2), Necropolis, Nina, Nomenklatura (2), NukeHard,
Number_of_the_Beast (5), Oropax, Parity, PcVrsDs (2), Perfume, Pitch,
Piter, Polish 217, Power_Pump, Pretoria, Prudents, Rat, Satan_Bug (2),
Shake, Sibel_Sheep (2), Slow, Spanish_Telecom (2), Spanz, Starship (2),
Subliminal, Sunday (2), Suomi, Suriv_1.01, Suriv_2.01, SVC (2),
Sverdlov (2), Svir, Sylvia, Syslock, Taiwan (2), Tequila, Terror,
Tiny (12), Todor, Traceback (2), Tremor, TUQ, Turbo_488, Typo, V2P6,
Vacsina (8), Vcomm (2), VFSI, Victor, Vienna (8), Violator, Virdem,
Virus-101 (2), Virus-90, Voronezh (2), VP, V-1, W13 (2), Willow,
WinVirus_14, Whale, Yankee (7), Zero_Bug.
Copyright (c) 1995 Virus Bulletin Ltd, 21 The Quadrant, Abingdon,
Oxfordshire, OX14 3YS, England. Tel: +44 (0)1235 555139. No part
of this publication may be reproduced, stored in a retrieval system,
or transmitted in any form without the prior written permission of
the publishers.