Testing lab AV-Comparatives has published the results of a comparative test of 20 antivirus products, measuring both malware detection and false positives. The test used 240,859 malware samples.
Participation in AV-Comparatives' comparative tests is limited: no more than 20 internationally renowned antivirus vendors were admitted to the 2012 series of public tests.
The September 2012 comparative on-demand malware detection test included the following antivirus products:
• AhnLab V3 Internet Security
• avast! Free Antivirus 7.0.1466
• AVG Anti-Virus 2012.0.2197
• AVIRA Antivirus Premium
• Bitdefender Anti-Virus+
• BullGuard Antivirus 12.0.231
• eScan Anti-Virus 11.0.1139.1225
• ESET NOD32 Antivirus
• F-Secure Anti-Virus 12.56.100
• Fortinet FortiClient Lite
• G DATA AntiVirus
• GFI Vipre Antivirus 5.0.5162
• Kaspersky Anti-Virus
• McAfee AntiVirus Plus 11.6.385
• Microsoft Security Essentials 4.0.1526.0
• Panda Cloud Free Antivirus 2.0.0
• PC Tools Spyware Doctor with AV
• Sophos Anti-Virus 10.0.8
• Trend Micro Titanium AntiVirus Plus 6.0.1215
• Webroot SecureAnywhere AV
The malware collection was “frozen” on August 21, 2012 and consisted of 240,859 samples. The antivirus products were last updated before the test, on August 28, 2012.
Note (on the use of additional third-party anti-virus engines/signatures): BullGuard, eScan and F-Secure are based on the Bitdefender engine. G DATA is based on the Avast and Bitdefender engines. PC Tools uses Symantec signatures.
Most antivirus programs were run with their default settings at maximum. Some products automatically switch to higher settings when malicious files are found, which makes it impossible to test protection against malware with true “default” settings. To obtain comparable results, the settings of some products were raised or lowered, in line with the vendors' recommendations.
Some of the antivirus products use cloud technologies that require an active Internet connection, and AV-Comparatives performed the tests with one. Users should be aware that detection rates may in some cases be significantly lower if the test is performed without an Internet connection (or if the “cloud” is unavailable for various reasons). The “cloud” should be considered an added advantage/feature that can increase detection rates (as well as improve response time and reduce false alarms), not a full replacement for local signature-based detection.
The test results
Graph of missed malware samples (lower is better)
Overall detection rate
1. G DATA 99.9%
2. AVIRA 99.8%
3. Panda, Trend Micro 99.6%
4. F-Secure 99.3%
5. Kaspersky, BitDefender, BullGuard, Fortinet, eScan 99.2%
6. McAfee 98.8%
7. Sophos 98.7%
8. Avast 98.6%
9. GFI Vipre 98.5%
10. AVG 98.0%
11. ESET 97.4%
12. AhnLab 95.6%
13. Microsoft 94.9%
14. PC Tools 94.4%
15. Webroot: below 80%
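To put these percentages in perspective, a short calculation (not from the report itself) converts each overall detection rate into the approximate number of samples missed out of the 240,859-sample test set:

```python
# Toy calculation: approximate missed-sample counts implied by the
# overall detection rates, given the 240,859-sample collection.
TOTAL_SAMPLES = 240_859

detection_rates = {
    "G DATA": 99.9,     # top of the table
    "AVIRA": 99.8,
    "Microsoft": 94.9,  # near the bottom of the table
}

for product, rate in detection_rates.items():
    missed = round(TOTAL_SAMPLES * (100 - rate) / 100)
    print(f"{product}: ~{missed} samples missed")
```

Even a fraction of a percent matters at this scale: the gap between 99.9% and 94.9% corresponds to roughly twelve thousand additional missed samples.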
Graph of false positives (lower is better)
False positive (FP) results
1. Microsoft 0
2. ESET 4 (very few FPs)
3. Kaspersky 5
4. Trend Micro 7
5. AVIRA, BitDefender, BullGuard 10 (several FPs)
6. Avast, Fortinet 11
7. McAfee 12
8. eScan 14
9. F-Secure, PC Tools 15
10. Sophos 19
11. AhnLab, Panda 20
12. G DATA 23 (many FPs)
13. GFI Vipre 34
14. AVG 36
15. Webroot 210 (too many FPs)
Award levels in the test
* These products received lower awards because of their false positives.
Awards take into account not only the detection rate; false positives on clean files are also considered.
An antivirus with a high percentage of detected malicious files that suffers from false alarms is not necessarily better than one that detects fewer malicious files but produces fewer false positives.
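This trade-off can be illustrated with a hypothetical scoring sketch. This is NOT AV-Comparatives' actual award formula; the penalty weight is an arbitrary assumption, chosen only to show how false positives can outweigh a small detection-rate lead:

```python
# Hypothetical scoring sketch (not the real award methodology):
# detection rate minus a fixed penalty per false positive.
def score(detection_pct: float, false_positives: int,
          fp_penalty: float = 0.05) -> float:
    """Toy score: higher is better; each FP subtracts a fixed amount."""
    return detection_pct - fp_penalty * false_positives

# Figures taken from the result tables above.
g_data = score(99.9, 23)   # highest detection, but many FPs
avira = score(99.8, 10)    # slightly lower detection, fewer FPs

print(f"G DATA: {g_data:.2f}, AVIRA: {avira:.2f}")
```

Under this toy penalty, AVIRA ends up ahead of G DATA despite the lower raw detection rate, which is exactly the kind of adjustment the award levels reflect.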
AV-Comparatives comparative antivirus testing, October 2012: full report.