AV-Comparatives tested 20 antivirus programs

Posted: October 20, 2012 in Testing

Test lab AV-Comparatives has published the results of a comparative test of 20 antivirus products for malware detection and false positives. The test set consisted of 240,859 malware samples.

Participation in AV-Comparatives' comparative testing is limited: no more than 20 internationally renowned antivirus vendors were accepted into the 2012 series of public tests.

Tested antivirus products

The September 2012 comparative on-demand malware detection test included the following antivirus products:

• AhnLab V3 Internet Security 8.0.6.13

• avast! Free Antivirus 7.0.1466

• AVG Anti-Virus 2012.0.2197

• AVIRA Antivirus Premium 12.0.0.1183

• Bitdefender Anti-Virus + 16.18.0.1406

• BullGuard Antivirus 12.0.231

• eScan Anti-Virus 11.0.1139.1225

• ESET NOD32 Antivirus 5.2.9.12

• F-Secure Anti-Virus 12.56.100

• Fortinet FortiClient Lite 4.3.5.472

• G DATA AntiVirus 23.0.3.2

• GFI Vipre Antivirus 5.0.5162

• Kaspersky Anti-Virus 13.0.1.4190

• McAfee AntiVirus Plus 11.6.385

• Microsoft Security Essentials 4.0.1526.0

• Panda Cloud Free Antivirus 2.0.0

• PC Tools Spyware Doctor with AV 9.0.0.2308

• Sophos Anti-Virus 10.0.8

• Trend Micro Titanium AntiVirus Plus 6.0.1215

• Webroot SecureAnywhere AV 8.0.1.233

 

The malware collection was "frozen" on August 21, 2012 and consisted of 240,859 samples. The antivirus products were last updated before the test on August 28, 2012.

Note (use of additional third-party anti-virus engines/signatures): BullGuard, eScan and F-Secure are based on the BitDefender engine. G DATA is based on the Avast and BitDefender engines. PC Tools uses Symantec signatures.

Most antivirus programs run with maximum settings by default. Some products automatically switch to higher settings when malicious files are found, which makes it impossible to test protection against various malware with the real "defaults." To obtain comparable results, some products were therefore tested with settings raised or lowered, in line with the vendors' recommendations.

Some of the antivirus products use cloud technologies that require an active Internet connection, and AV-Comparatives performed the tests with an active connection. Users should be aware that in some cases detection rates may be significantly lower if the scan runs without an Internet connection (or when the "cloud" is unavailable for any reason). The cloud should be considered an additional feature that improves detection rates (as well as response times and false-alarm rates), not a full replacement for local signature-based detection.

The test results

Graph of missed malware samples (lower is better)

Overall detection rate

1. G DATA 99.9%
2. AVIRA 99.8%
3. Panda, Trend Micro 99.6%
4. F-Secure 99.3%
5. Kaspersky, BitDefender, BullGuard, Fortinet, eScan 99.2%
6. McAfee 98.8%
7. Sophos 98.7%
8. Avast 98.6%
9. GFI Vipre 98.5%
10. AVG 98.0%
11. ESET 97.4%
12. AhnLab 95.6%
13. Microsoft 94.9%
14. PC Tools 94.4%
15. Webroot is below 80%
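As an illustration (not part of the original report), detection rates like those above follow from the number of samples missed out of the 240,859-sample test set. The sketch below is hypothetical; the missed count in the example is assumed, not taken from the report.

```python
# Illustrative sketch: deriving an overall detection rate from missed samples.
# TOTAL_SAMPLES comes from the article; the missed count used below is assumed.

TOTAL_SAMPLES = 240_859  # size of the frozen test set (August 21, 2012)

def detection_rate(missed: int, total: int = TOTAL_SAMPLES) -> float:
    """Return the detection rate in percent, rounded to one decimal place."""
    return round(100.0 * (total - missed) / total, 1)

# A product missing roughly 241 of the 240,859 samples
# detects about 99.9% of them.
print(detection_rate(241))  # → 99.9
```

Conversely, even a seemingly small gap between 99.9% and 98.0% corresponds to thousands of additional missed samples at this test-set size.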

Graph of false positives (lower is better)

False positive results

1. Microsoft 0
2. ESET 4 (very few FPs)
3. Kaspersky 5
4. Trend Micro 7
5. AVIRA, BitDefender, BullGuard 10 (several FPs)
6. Avast, Fortinet 11
7. McAfee 12
8. eScan 14
9. F-Secure, PC Tools 15
10. Sophos 19
11. AhnLab, Panda 20
12. G DATA 23 (many FPs)
13. GFI Vipre 34
14. AVG 36
15. Webroot 210 (too many FPs)

Award levels

Chart of award levels

* These products received their awards partly because of lower false positives.

The awards take into account not only detection rates; false positives produced when scanning clean files are also considered.

An antivirus that detects a high percentage of malicious files but suffers from false alarms is not necessarily better than one that detects slightly fewer malicious files but produces fewer false positives.
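To make this trade-off concrete, here is a hypothetical scoring sketch. This is not AV-Comparatives' actual award methodology (that is described in the full report); the false-positive penalty weight is an arbitrary assumption chosen purely for illustration.

```python
# Hypothetical ranking sketch: combine detection rate and false positives.
# The penalty weight is an assumption, not AV-Comparatives' method.

FP_PENALTY = 0.01  # score points subtracted per false positive (assumed)

def score(detection_pct: float, false_positives: int) -> float:
    """Higher is better: detection percentage minus a false-positive penalty."""
    return detection_pct - FP_PENALTY * false_positives

# Figures taken from the results above:
g_data = score(99.9, 23)  # highest detection, but many FPs
avira = score(99.8, 10)   # slightly lower detection, fewer FPs

# With this (assumed) weight, AVIRA edges ahead despite lower detection.
print(g_data < avira)
```

With a smaller penalty weight the ranking flips back, which is exactly why the choice of how to weigh false positives matters when comparing products.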
AV-Comparatives comparative antivirus test, October 2012: full report.
